- Over 7 years of experience in information technology, with Teradata experience utilizing tools, analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables.
- Extensive knowledge of, and proven results in, Teradata query and application tuning/optimization.
- Set up and reviewed workload management processes, provided solutions to various customer concerns about workload management setup, and shared Teradata best practices.
- Worked in several areas of Data Warehouse including Analysis, Requirement Gathering, Design, Development, Testing, and Implementation.
- Experience in designing and setting up BI standards, best practices, and rules for DW/BI environments at the enterprise level.
- Good knowledge of Dimensional Data Modeling, Star schema, Snowflake schema, and Fact and Dimension tables.
- Over 4 years of strong data warehousing experience using Informatica PowerMart 6.1/5.1/4.7, PowerCenter 9.5.0/9.1.0/8.6.1/8.1/7.1.3/7.0/6.2/5.1, PowerExchange, and PowerConnect as ETL tools.
- Worked on the remediation (performance tuning) team, improving query performance of ad hoc user queries and production queries (ETL and reporting).
- Strong knowledge of Extraction, Transformation, and Loading (ETL) processes using UNIX shell scripting and SQL*Loader.
- Expertise in performance tuning and query optimization of Teradata SQL. Experience designing and developing data conversion/data integration modules to support large-scale system migrations and implementations.
- Expertise in Report formatting, Batch processing, and Data Loading and Export using BTEQ.
- Working knowledge of Teradata Administrator, FastLoad, MultiLoad, TSET, TPump, SQL, PDCR, ARCMAIN, and TASM for workload management.
- Strong expertise in Teradata development and indexes (Primary Index, Secondary Index, Partitioned Primary Index, Join Index).
- Worked on Data Mining techniques for identifying claims on historical data.
- Expertise in database programming, including writing stored procedures (SQL), functions, triggers, and views in Teradata, DB2, and MS Access.
- Created several BTEQ scripts involving derived tables and volatile/global temporary tables to extract data for business users on an ad hoc and scheduled basis.
- Well versed in handling face-to-face meetings with business users, business stakeholders, and technical team members to gather and analyze business requirements and outline proposed solutions.
- Extensively involved in data loading activities: created many Informatica mappings to source data from upstream systems.
- Expert in tuning performance of SQL queries and ETL process.
- Designed, developed, and coded with Teradata.
- Expertise in OLTP/OLAP system study, analysis, and E-R modeling; developed database schemas such as Star schema and Snowflake schema (Fact and Dimension tables).
Tools & Utilities: TPT, BTEQ, FastLoad, MultiLoad, TPump, FastExport, SQL Assistant, Teradata Viewpoint, TSET, Teradata Mapping Manager
ETL Tools: Informatica PowerCenter 10.2.0/9.6.1/9.5.0/9.1.0/8.6.1/8.1/7.1.3/7.0/6.2/5.1, Hadoop 2.6.2 (HDFS, MapReduce, Hive, Sqoop, Pig), DataStage 8.7
Operating Systems: UNIX, Windows 7/8/10/XP/2000, Windows NT 4.0, z/OS, and MVS (OS/390).
Databases: Teradata V2R5/V2R6/V12/V14/V15, Oracle 8i/9i/10g/11g, SQL Server 2000/2005/2008/2012/2014/2016, DB2.
Data warehouse: Teradata, Oracle, SQL Server
Data Modelling: Erwin, Visio
Applications: MS Word, Excel, Outlook, FrontPage, PowerPoint, MS Visio
Reporting Tools: MicroStrategy 9.2.1/9.0/8i/7i.
Support Tools: Tivoli, Impact, Autosys, Control-M, UC4, HPQC
Other Languages/Technologies/Platforms: SQL, Java, VB, C#, C, C++, UNIX, Shell scripting (K-shell, C-shell).
Confidential, Bloomfield, CT
- Worked in all stages of the SDLC: business, functional, and technical requirements gathering, design, documentation, development, and testing.
- Heavily involved in writing complex SQL queries based on the given requirements, including complex Teradata joins, stored procedures, and macros.
- Worked on Teradata stored procedures and functions to conform the data and load it into tables.
- Worked on converting SQL Server functions into Teradata stored procedures for conforming the data.
- The financial data mainly included Accounts Payables, Account Receivables, Budget, Fixed Assets, General Accounting, Procurement, Treasury Accounting, Performance Budgeting etc.
- Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.
- Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
- Analyzed the system for the functionality required as per the requirements and created System Requirement Specification document (Functional Requirement Document).
- Managed and supported production/development/test servers, databases, and data warehouses in standalone, clustered, and cloud (AWS, Azure) environments.
- Designed and used shell scripts to automate DataStage jobs and validate files.
- Extensively used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager, and Informatica Workflow Manager.
- Developed various complex mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Optimized and tuned SQL queries used in the Source Qualifier of certain mappings to eliminate full table scans.
- Created and configured workflows, worklets, and sessions to transport the data to target warehouse tables using Informatica Workflow Manager.
- Created various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
- Involved in the administration of the Informatica Repository and in the design and architecture of multiple repositories to support multiple application development areas.
- Created, tested, and debugged stored procedures, functions, packages, cursors, and triggers using PL/SQL Developer.
- Developed UNIX shell scripts to generate parameter files and executed Oracle procedures as batch jobs.
- Created scripts for Batch test and set the required options for overnight, automated execution of test scripts.
- Created Shell Script to send the data to downstream systems using FTP / SFTP.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Wrote scripts for extraction, transformation, and loading of data from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad, and TPump.
- Wrote complex SQL and used volatile tables for code modularity and to improve query performance.
- Created Data Mapping Documents as part of the Data migration projects.
- Worked closely with the Business to understand their needs and provided my inputs to address their needs and requirements.
Environment: Teradata 15/14, Microsoft SQL Server, UNIX, Informatica PowerCenter, TPump, FastLoad, BTEQ, MLoad, MS Office tools, Erwin, Jira, Tableau.
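An illustrative sketch of the kind of shell-wrapped BTEQ extract described above, staging rows in a volatile table before exporting. All object names, credentials, and the system name are placeholders, not details from an actual engagement:

```shell
#!/bin/sh
# Generate a BTEQ export script that stages a subset of rows in a volatile
# table, then run it where the Teradata client is available.
BTEQ_SCRIPT=extract_daily_txn.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,password;
.SET WIDTH 200;

/* Stage the subset once; the export query reuses it instead of
   re-scanning the base table. */
CREATE VOLATILE TABLE vt_txn AS (
    SELECT txn_id, acct_id, txn_amt
    FROM   sales_db.daily_txn
    WHERE  txn_dt = CURRENT_DATE - 1
) WITH DATA
ON COMMIT PRESERVE ROWS;

.EXPORT REPORT FILE = daily_txn.out;
SELECT acct_id, SUM(txn_amt)
FROM   vt_txn
GROUP  BY acct_id;
.EXPORT RESET;
.LOGOFF;
EOF

# Run only where the Teradata bteq client is installed.
if command -v bteq >/dev/null 2>&1; then
    bteq < "$BTEQ_SCRIPT"
fi
```

Generating the BTEQ file from the wrapper keeps credentials and dates parameterizable from the scheduler, which is the usual reason for this shell-plus-BTEQ layering.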
Confidential, West Chester PA
Teradata/ ETL Consultant
- Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.
- Extracted Source data from REST API using UNIX shell scripts.
- Analyzed the requirements and the existing environment to help devise the right strategy for loading and extracting data in the data warehouse.
- Prepared the ETL specifications, mapping documents, and ETL framework specification.
- Implemented slowly changing dimension logic in mappings to effectively handle change data capture, which is typical in data warehousing systems.
- Prepared functional and technical specifications and design documents.
- Responsible for data profiling, data cleansing, and data conformation.
- Actively participated in code migration process to higher environment and documents creation for the same.
- Created BTEQ scripts for preloading work tables prior to the main load process.
- Proficient in understanding Teradata EXPLAIN plans, the COLLECT STATISTICS option, secondary indexes (USI, NUSI), Partitioned Primary Indexes (PPI), and volatile, global temporary, and derived tables.
- Reviewed SQL for missing joins, join constraints, data format issues, mismatched aliases, and casting errors.
- Used Teradata Manager, BTEQ, FastLoad, MultiLoad, TPump, SQL, and TASM for workload management.
- Wrote various TPT scripts for ad hoc requirements and used tdload to move data from one environment to another via TPT.
- Performed workload management using tools such as Teradata Manager, FastLoad, MultiLoad, TPump, TPT, and SQL Assistant.
- Developed the scripts using Teradata Parallel Transporter and implemented the Extraction- Transformation-Loading of data with TPT.
- Performed unit testing, database testing, and data verification and validation of the developed code, comparing results between Sybase and Teradata systems.
- Identified performance bottlenecks in production processes and key places where SQL could be tuned to improve overall performance.
- Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
- Developed UNIX scripts to automate different tasks involved as part of loading process.
Environment: Teradata 15, Teradata SQL Assistant, REST API, TPT, BTEQ, MLOAD, FLOAD, FASTEXPORT, Erwin Designer, Informatica 9.5, Tableau, Power BI, UNIX, Korn shell scripts.
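A sketch of the tdload (TPT Easy Loader) usage described above for copying a table between environments, driven by a job variables file. System names, credentials, and table names are placeholders; check the TPT reference for the exact job variable names your client version supports:

```shell
#!/bin/sh
# Build a tdload job variables file describing source and target, then
# invoke tdload where the TPT client is installed. All values below are
# hypothetical placeholders.
JOBVARS=copy_table.jobvars

cat > "$JOBVARS" <<'EOF'
SourceTdpId        = 'tdprod',
SourceUserName     = 'etl_user',
SourceUserPassword = 'password',
SourceTable        = 'sales_db.daily_txn',
TargetTdpId        = 'tddev',
TargetUserName     = 'etl_user',
TargetUserPassword = 'password',
TargetTable        = 'sales_dev.daily_txn'
EOF

# Run only where the TPT client is installed.
if command -v tdload >/dev/null 2>&1; then
    tdload -f "$JOBVARS" copy_daily_txn
fi
```

Keeping connection details in a job variables file lets the same script be promoted between environments by swapping the file rather than editing the command line.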
Confidential, EL Segundo, CA
ETL /Teradata Developer
- Understood the specifications and analyzed data according to client requirements.
- Created roles and profiles on an as-needed basis; granted privileges to roles and added users to roles based on requirements.
- Managed database space, allocated new space to databases, and moved space between databases as needed.
- Assisted developers and DBAs with design, architecture, development, and query tuning, including query modification, index selection, and refreshing statistics collection.
- Proactively monitored bad queries, aborted them using PMON, watched for blocked sessions, and worked with development teams to resolve them.
- Proactively monitored database space, identified tables with high skew, and worked with the data modeling team to change the Primary Index on highly skewed tables.
- Moved tables from test to production using FastExport and FastLoad.
- Extensively worked with DBQL data to identify high-usage tables and columns.
- Implemented secondary indexes on frequently used columns to improve efficiency in retrieval.
- Exported data to flat files using Teradata FastExport.
- Worked extensively with Teradata SQL Assistant to interface with Teradata.
- Wrote several Teradata BTEQ scripts to implement the business logic.
- Populated Teradata tables using the FastLoad utility.
- Created complex Teradata macros, views, and stored procedures for use in reports.
- Performed error handling and performance tuning of Teradata queries and utilities.
- Created error log tables for bulk loading.
- Actively involved in the TASM workload management setup across the organization; helped define TASM workloads and develop TASM exceptions by contributing workload dynamics from a development standpoint.
- Worked on capacity planning; reported disk and CPU usage growth using Teradata Manager, DBQL, and ResUsage.
- Used the Teradata Manager data collection facility to set up AMP usage collection, canary query response, spool usage response, etc.
- Developed complex mappings using multiple sources and targets in different databases, flat files.
- Developed Teradata BTEQ scripts; automated workflows and BTEQ scripts.
- Performed query optimization (EXPLAIN plans, collected statistics, Primary and Secondary Indexes).
Environment: Teradata 12, crontab, Autosys, PMON, BMC Remedy, TARA, Symantec NetBackup, Teradata Administrator, Teradata SQL Assistant, BTEQ, FastExport, FastLoad, Subversion, Informatica.
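A sketch of the skew-monitoring work described above: a dictionary query over `DBC.TableSizeV` (the standard per-AMP space view) that flags tables whose Primary Index distributes data unevenly. The 2.0 ratio threshold and system/credential values are placeholders:

```shell
#!/bin/sh
# Write a skew-detection query to a file, then run it via bteq where the
# Teradata client is available. CurrentPerm is reported per AMP (Vproc),
# so max/avg across AMPs approximates table skew.
SQL_FILE=table_skew.sql

cat > "$SQL_FILE" <<'EOF'
SELECT  DataBaseName,
        TableName,
        SUM(CurrentPerm)                               AS total_perm,
        MAX(CurrentPerm) / NULLIF(AVG(CurrentPerm), 0) AS skew_ratio
FROM    DBC.TableSizeV
GROUP BY DataBaseName, TableName
HAVING  skew_ratio > 2.0   /* one AMP holds far more than the average */
ORDER BY skew_ratio DESC;
EOF

if command -v bteq >/dev/null 2>&1; then
    bteq <<EOF
.LOGON tdprod/dba_user,password;
.RUN FILE = $SQL_FILE;
.LOGOFF;
EOF
fi
```

Tables surfacing here are the candidates handed to the data modeling team for a Primary Index change, as the bullets above describe.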
- Created Star Schema model with required facts and dimensions.
- Defined various facts and Dimensions in the data mart including Aggregate and Summary facts.
- Converted the data mart from logical to physical design: defined data types, constraints, and indexes; generated the schema in the database; created automated scripts; and defined storage parameters for database objects.
- Created and Maintained Teradata Tables, Views, Macros, Triggers and Stored Procedures
- Coded using Teradata analytical functions and Teradata BTEQ SQL; wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment.
- Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as MultiLoad, FastLoad, FastExport, and BTEQ (Teradata) and SQL*Plus and SQL*Loader (Oracle).
- Worked on complex queries to map the data as per the requirements.
- Populated and refreshed Teradata tables using the FastLoad, MultiLoad, and TPump utilities for user acceptance testing and for loading history data into Teradata.
- Created UNIX scripts for triggering stored procedures and macros.
- Involved in Performance tuning for the long running queries.
Environment: Teradata 12, Teradata SQL Assistant, BTEQ, FLoad, FExport, MLoad, TPT, TPump, Erwin 4.1.4, Quest Toad 9.3, UNIX shell scripting, SQL*Loader, PuTTY, SQL Server, Windows XP, UNIX.
- Understood the client specifications and validated them against the data model.
- Developed BTEQ scripts based on the technical specifications for loading the tables.
- Understood and documented the dependencies between tables from a referential integrity perspective and implemented them in the job scheduling.
- Wrote several Teradata BTEQ scripts for reporting purposes.
- Developed BTEQ scripts to load data from Teradata Staging area to Teradata data mart.
- Actively participated in the requirement discussion from end users, Dashboard design sessions with the data/business analysts and data model sessions with the data team.
- Created dashboards at different levels of data and enabled navigation across them using hyperlinks and report links.
- Developed UNIX scripts for handling exceptions and special requests from the production support team.
- Monitored production runs.
- Worked on any issues arising in production batches.
- Performed enhancements on the D3A scripts to resolve production issues.
Environment: Teradata V12/V13, Teradata SQL Assistant, Teradata Utilities (BTEQ, MLoad, Fast Load, Fast Export and TPT), Informatica 8.6.1, DB2, SQL server 2005/2008, SAS, Control-M, UNIX.