- 12 years of IT experience in analysis, design, development and implementation of Data Warehousing applications using Teradata utilities, UNIX, Teradata SQL Assistant 13.0, and ETL and BI tools
- Expertise in OLTP/OLAP System Study, Analysis and E-R modeling; developed Database Schemas such as Star schema and Snowflake schema (Fact Tables, Dimension Tables) used in relational, dimensional and multidimensional modeling
- Proficient in converting logical data models to physical database designs in a Data Warehousing environment, with an in-depth understanding of Database Hierarchy, Data Integrity concepts and Data Analysis
- Expert in Coding Teradata SQL, Teradata Stored Procedures, Macros and Triggers
- Involved in the full lifecycle of various projects, including requirement gathering, system design, application development, enhancement, deployment, maintenance and support
- Worked on migrations from Teradata 12 to Teradata 13 and from Teradata 13.10 to Teradata 14.
- Conducted code walkthroughs and internal and external quality assurance reviews of applications; debugged and fixed identified defects, comparing test results with production results to confirm the changes were effective.
- Strong working experience in planning and carrying out Data Warehousing, MPP and Large-Scale Database Management
- Thorough knowledge of Teradata and Oracle RDBMS Architecture.
- Experienced in troubleshooting Teradata scripts, fixing bugs, and addressing production and performance issues.
- Developed Test Scripts, Test Cases, and SQL QA Scripts to perform System Testing and Load Testing.
- Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer and Teradata SQL Assistant
- Expertise in working with various transformations such as Aggregator, Lookup, Update Strategy, Joiner, Filter, Sequence Generator, Normalizer, Sorter and Router in Informatica PowerCenter.
- Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN plans and COLLECT STATISTICS
- Worked effectively with Teradata TPT Connection Objects and Viewpoint
- Certified Teradata consultant with experience in Teradata Physical implementation and Database Tuning
- Proven track record in delivering effective design documents, code deliverables, test strategies and business value to the customer.
- Expert-level knowledge of Teradata, with good exposure to other databases such as SQL Server and DB2.
- Strong expertise in physical modeling, with knowledge of Primary, Secondary, PPI and Join Indexes.
- Strong experience in allocating work and guiding and leading teams of developers in an onsite-offshore model
- Expertise in building Enterprise Data Warehouse from wide variety of source systems and Data Marts sourced from EDW.
- Experienced in tuning Teradata SQL using Temporary tables and indexes
- Extensive experience in loading high volume data, requirements gathering, analysis, design, development, test and implementation.
- Experienced in extracting data from mainframe flat files and converting them into Teradata tables using SAS PROC IMPORT, PROC SQL etc.
- Experienced in working on large volumes of data using Teradata SQL and Mainframes.
- Experienced in extracting data from mainframe flat files and loading them into Teradata tables using Teradata utilities like MLOAD, TPUMP, FASTLOAD and BTEQ, and similarly unloading data from tables to files using FASTEXPORT
- Strong experience in working with MS Word, MS Excel and MS PowerPoint.
- Good experience in Production Support, identifying root causes, troubleshooting, maintaining Mainframe Jobs and submitting Change Controls.
- Experienced in Preparing Training Material and Conducting Training Sessions to the new trainees.
- Competitive spirit and strong ability to find multiple approaches to problem solving; highly skillful, creative and innovative in developing efficient logic/code.
- Worked on Agile Methodology and the scrum process.
- Expertise in Writing Shell Scripts.
- Expertise in working with various operational sources like Oracle and SQL Server
- Proficiency in prioritizing and multi-tasking to ensure that tasks are completed on time
- Demonstrate a willingness, interest, and aptitude to learn new technologies and skills.
- Good Communication and interpersonal skills.
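To illustrate the mainframe flat-file extraction work summarized above, a minimal Python sketch that parses a fixed-width feed into rows ready for loading; the field layout and sample records are hypothetical, not taken from any engagement:

```python
# Parse a fixed-width flat file (mainframe-style feed) into dict rows.
# The (name, start, end) layout below is hypothetical, for illustration only.
LAYOUT = [("cust_id", 0, 8), ("name", 8, 28), ("balance", 28, 38)]

def parse_record(line):
    """Slice one fixed-width record into a dict, trimming space padding."""
    row = {name: line[start:end].strip() for name, start, end in LAYOUT}
    row["balance"] = float(row["balance"])  # numeric field stored as text
    return row

def parse_feed(lines):
    """Parse all non-blank records in a feed."""
    return [parse_record(line) for line in lines if line.strip()]

if __name__ == "__main__":
    feed = [
        "00000001JOHN SMITH          0000123.45",
        "00000002JANE DOE            0000678.90",
    ]
    for row in parse_feed(feed):
        print(row)
```

In practice the parsed rows would be written out for FASTLOAD/MLOAD rather than printed.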
Programming Languages: JCL, SQL, COBOL, Easytrieve, SAS
Specialised Software: Teradata RDBMS, TSO/ISPF, CA7, Changeman, File Aid, DCCS, QMF, Endeavour, Informatica, Control-M, Teradata SQL Assistant, Maximo Service Centre.
Packages: MS Word, MS Excel, MS PowerPoint, MS Project, MS Visio, SharePoint
Operating Systems: IBM S390 Mainframes, Windows NT/2000/XP/7 Professional.
Teradata: 12 Basics Certification - TE0121
Teradata: V2R5 Basics Certification
Confidential, Plano Texas
Sr. Teradata Developer
- Designed the flow of the CDM (Change Data Management) process.
- Validated data with respect to the target CDM process.
- Uploaded Data Model in Reveleus for every change request implementation.
- Worked on key management reports such as data quality metrics, data validation metrics, trend analysis over a range of periods, AFS and comparison reports using Business Objects XI R2
- Received sales data from stores and loaded it into the data warehouse.
- Worked with the migration team to fix deployment issues in the Production environment.
- Wrote UNIX shell scripts for archiving the feeds and executed the batch using Autosys.
- Performed performance tuning of the data mart, BO universes and reports.
- Retrieved data from the Teradata database.
- Worked on SQL performance tuning and handled critical spool space errors in the database.
- Implemented quick fixes and suggested permanent fixes to other teams per business needs.
- Implemented data fixes in the database per user requests for the business.
- Generated ad hoc data as Excel reports for users (through Teradata SQL).
- Worked on root cause analysis of Production Support issues.
- Created UNIX shell scripts as per the business requirement.
- Expert in gathering, analyzing, and documenting business requirements, functional requirements and data specifications for Business Objects Universes and Reports.
- Worked on ETL tools such as Informatica to load data to Teradata by making various connections to load and extract data to and from Teradata efficiently.
- Transferred large volumes of data using Teradata FastLoad, MultiLoad and TPump.
- Worked on SQL tuning and optimization of the Business Objects reports.
- Applied Business Objects security features such as row-level, report-level and object-level security in the universe to secure sensitive data.
- Implemented Informatica transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter and Sequence Generator.
- Developed simple & complex mappings using Designer to load dimension and fact tables as per STAR schema techniques.
Environment: Teradata 13.0, Teradata SQL Assistant, Teradata Manager, BTEQ, TPT, MLOAD, FLOAD, FASTEXPORT, UNIX Shell Scripts, Ab Initio, Tableau, Business Objects Universes.
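The feed-archiving shell work described above can be sketched in Python as follows; the directory layout, the `.dat` suffix, and the scheduler hookup are illustrative assumptions, not details from the project:

```python
import gzip
import shutil
from datetime import date
from pathlib import Path

def archive_feeds(feed_dir, archive_root, suffix=".dat", run_date=None):
    """Move processed feed files into a dated archive folder, gzip-compressed.

    Mirrors the kind of post-batch archiving a scheduler (e.g. Autosys)
    would invoke; names here are placeholders for illustration.
    """
    run_date = run_date or date.today()
    target = Path(archive_root) / run_date.strftime("%Y%m%d")
    target.mkdir(parents=True, exist_ok=True)
    archived = []
    for feed in sorted(Path(feed_dir).glob(f"*{suffix}")):
        dest = target / (feed.name + ".gz")
        with open(feed, "rb") as src, gzip.open(dest, "wb") as out:
            shutil.copyfileobj(src, out)
        feed.unlink()  # remove the original only after a successful copy
        archived.append(dest)
    return archived
```

A scheduler job would simply call this after the load batch completes, keeping one dated folder per run.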
Sr. Teradata Developer
- Developed SQL join indexes to solve strategic and tactical queries.
- Populated tables using different Teradata data load and unload utilities such as FastLoad and MultiLoad.
- Worked with collect statistics and join indexes.
- Involved in unit testing, systems integration and user acceptance testing.
- Worked on Preparing Test Cases and performing Testing.
- Generated a number of ad hoc reports per client requirements.
- Used MULTISET tables to load bulk data in the form of inserts and deletes.
- Created indexes and joins on tables as per requirements.
- Gathered required information from users and prepared design documents.
- Interacted with different system groups for analysis of systems.
- Created tables, views in Teradata, according to the requirements.
- Developed BTEQ, MultiLoad and FastLoad scripts to populate data into the Teradata database.
- Created various types of temporary tables such as volatile and global temporary tables.
Environment: Teradata R13, Informatica, AIX, BTEQ, TPT, MLOAD, FLOAD, FASTEXPORT, UNIX Shell Scripts
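As a sketch of the BTEQ scripting described above, the Python helper below renders a BTEQ script that stages rows in a volatile table before the final insert; the logon placeholders and table names are made up for illustration:

```python
def bteq_volatile_load(stage_table, target_table, key_cols):
    """Render a BTEQ script that stages rows in a VOLATILE table before
    the final insert. Table and column names are placeholder assumptions."""
    keys = ", ".join(key_cols)
    return f"""\
.LOGON tdpid/user,password;

CREATE VOLATILE TABLE stg_work AS
  (SELECT * FROM {stage_table}) WITH DATA
  PRIMARY INDEX ({keys})
  ON COMMIT PRESERVE ROWS;

INSERT INTO {target_table}
SELECT * FROM stg_work;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
"""
```

The generated text would be fed to `bteq` in a shell wrapper; the `.IF ERRORCODE` check makes the batch job fail loudly on SQL errors.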
Sr. Teradata Developer
- Analyzed source data and gathered requirements from the business users.
- Extensively involved in modeling the semantic layer.
- Built parameterized shell scripts for ETL jobs and scheduler jobs in Windows.
- Built tables, views, and UPI, NUPI, USI and NUSI indexes.
- Designed jobs by using Informatica for extracting, transforming, and loading (ETL) data from heterogeneous sources into a data warehouse.
- Gathered and managed requirements to execute successful development of data warehouse solutions
- Created ERWIN reports in RTF format as required, published the data model in Model Mart, created naming convention files, and coordinated with DBAs to apply data model changes
- Led the technical team and off-shore resources.
- Involved in the logical and physical design of the database and creation of the database objects.
- Worked on different transformations for the match and merge consolidation process.
- Involved in analyzing existing logical and physical data models
- Worked with application developers and production teams across functional units to identify business needs and discuss solution options.
- Defined Framework Models, Packages, Business Views and other objects of Cognos 10.1.
- Performed query optimization (EXPLAIN plans, COLLECT STATISTICS, Primary and Secondary indexes).
- Extensively worked in the performance tuning of transformations, Sources, Sessions, Mappings and Targets.
- Responsible for preparing ETL strategies for extracting data from different data sources.
- Worked extensively on forward and reverse engineering processes. Created DDL scripts for implementing Data Modeling changes.
- Responsible for the performance tuning of end user reports.
- Oversaw and managed team resources to ensure deliverables met business requirements and deadlines
- Gathering requirements and responsible for off-shore development work.
- Developed mappings, workflows, and performed extensive troubleshooting in Informatica 9.5.1
- Widely used batch scripting in Windows to support ETL and BTEQ scripts.
- Worked on reusable sources, targets, transformations, mappings, sessions etc.
- Documenting and updating the deployment plan, test plans, design plans, production support plan etc.
- Experienced in using Teradata MLOAD, FLOAD, and FASTEXPORT.
Environment: Teradata 13.10, Teradata SQL Assistant, UNIX, Teradata Utilities.
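The index work above hinges on even hash distribution of rows across AMPs. A rough Python sketch of a skew check, using `crc32` as a stand-in for Teradata's row-hash (the real hash function differs, but the distribution idea is the same):

```python
from collections import Counter
from zlib import crc32

def skew_factor(pi_values, amps=10):
    """Estimate data skew for a candidate primary index by hashing each
    value to a pseudo-AMP bucket. crc32 stands in for Teradata's row-hash;
    this is an illustration, not the actual Teradata algorithm."""
    rows_per_amp = Counter(crc32(str(v).encode()) % amps for v in pi_values)
    peak = max(rows_per_amp.values())
    avg = len(pi_values) / amps
    return peak / avg  # 1.0 = perfectly even; larger = more skewed

# A unique key distributes evenly; a low-cardinality column piles up.
even = skew_factor(range(10_000))
skewed = skew_factor(["US"] * 10_000)  # one value -> one bucket gets all rows
```

This mirrors the pre-load sampling checks mentioned earlier: a candidate NUPI on a low-cardinality column shows a high factor and should be rejected.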
- Performed data analysis on new/changed requirements per the client specifications.
- Extracted data from flat files and other RDBMS databases into the staging area, applied transformations in the working area, and loaded data into target tables of the data warehouse.
- Developed numerous Teradata SQL queries by creating SET or MULTISET tables, views, and global and volatile tables, using inner and outer joins, date and string functions, and advanced techniques such as the RANK and ROW_NUMBER functions.
- Tuned SQL to optimize performance, spool space usage and CPU usage.
- Tested primary index choices and skew ratios using sampling techniques before populating tables, and ran EXPLAIN in Teradata before querying large tables with several joins.
- Performed tuning and optimization of complex SQL queries using Teradata Explain.
- Worked heavily on writing complex SQL queries to pull required information from the database using Teradata SQL Assistant.
- Involved in troubleshooting the production issues and providing production support.
- Provided technical second-level product support for the company's application and coordinated with customers to resolve technical support issues.
- Responsible for tracking trouble tickets and resolving escalated support incidents; answered technical support queries, implemented fixes for application problems, and worked on cross-functional teams to proactively address support issues.
- Created and maintained documentation for supported applications.
Environment: IBM 3090, MVS, COBOL, JCL, SQL, TSO/ISPF, FILE-AID, CA7, Maximo Service Centre, Changeman, NDM, Teradata SQL Assistant 12.0, Teradata SQL Assistant 7.0, Teradata V2R5
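The RANK/ROW_NUMBER techniques mentioned above can be illustrated outside SQL; a small Python emulation of `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)`, with made-up column names:

```python
from itertools import groupby
from operator import itemgetter

def row_number(rows, partition_by, order_by, desc=True):
    """Emulate SQL ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...):
    number rows within each partition in sort order. The 'store'/'amt'
    columns below are hypothetical, for illustration only."""
    out = []
    rows = sorted(rows, key=itemgetter(partition_by))  # groupby needs sorted input
    for _, grp in groupby(rows, key=itemgetter(partition_by)):
        ranked = sorted(grp, key=itemgetter(order_by), reverse=desc)
        for n, row in enumerate(ranked, start=1):
            out.append({**row, "rn": n})
    return out

sales = [
    {"store": "A", "amt": 50}, {"store": "A", "amt": 70},
    {"store": "B", "amt": 20},
]
top = [r for r in row_number(sales, "store", "amt") if r["rn"] == 1]
# keeps the highest-amt row per store
```

Filtering on `rn == 1` corresponds to the common QUALIFY ROW_NUMBER() = 1 pattern used to pick one row per key.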
- Performed impact analysis for new/changed requirements and prepared the LLD (Low Level Design).
- Compared client-supplied artifacts such as the BRD and HLD with the LLD to identify any incompleteness.
- Prepared the UTP and UTS.
- Developed code according to the LLD using Confidential tools such as FILE-AID, SAR, JCL and DCCN to ease the daily work effort.
- Standardized code as per the requirements using the Confidential tool ASA (Automated Standard Analyzer).
- Performed Peer Reviews and Code Walkthroughs.
- Performed SIT (System Integration Testing) and provided data to the Onshore Counterpart.
- Assisted in deployment and provided technical and operational support during installs.
- Involved in post-implementation support.
- Tracked project status, generated reports and conducted Daily Performance Meetings to reduce defects and improve productivity.
Environment: IBM 3090, MVS, COBOL, JCL, SQL, TSO/ISPF, FILE-AID, CA7, DCCS, NDM, QMF.