
Sr. Teradata Resume

Hershey, PA


  • Progressive and innovative software professional with over 7 years of Teradata experience spanning requirements analysis, system analysis, application design, development, testing, configuration management, client/user support services, and application management.
  • 5 years of Data Analysis experience in User Requirement Gathering, User Requirement Analysis, GAP Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
  • 5 years of Dimensional Data Modeling experience using Erwin 4.5/4.0, Star/Snowflake schema modeling, fact and dimension tables, and physical and logical data modeling.
  • Proven track record in planning, building, and managing successful large-scale Data Warehouse and decision support systems. Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data management, Data transportation and Data Staging.
  • Experience in designing and developing Extract, Transform and Load (ETL) processes on Teradata.
  • Strong Teradata experience with SQL Assistant and BTEQ. Solid skills and expert knowledge of Teradata V2R5/V2R6/V2R12 and its utilities, including BTEQ, MultiLoad, FastLoad, FastExport, and TPump.
  • Good knowledge of Teradata architecture and concepts.
  • Experience working on the performance tuning and optimization of Teradata SQL queries.
  • Knowledgeable and experienced in data warehousing, Teradata utilities, and MS reporting.
  • Good experience in Oracle 11g/10g/9i.
  • Proficient in data warehousing concepts and design techniques.
  • Handling all domain and technical interactions with the client and the application users.
  • Familiarity with Win XP/2000 Advanced Server /98/95/NT, UNIX, LINUX.
  • Experience in SQL Programming (Stored Procedures, Functions and Triggers) for faster transformations and Development using Teradata and Oracle.
  • Experience in programming in C, Java and web technologies (Dreamweaver, HTML, and XML).
  • Skilled at finding multiple approaches to problem solving; creative and innovative in developing efficient logic and code.
  • Team player with the ability to adapt to various environments, excellent communication and interpersonal skills, and a strong commitment to customer satisfaction.


  • Bachelor in Computer Science and Engineering


Confidential, Hershey, PA Feb 2011 - Till Date
Sr. Teradata Developer
Tradewinds, Nielsen

Project Trade Winds is an effort to transform Hershey's trade promotion and total sales volume planning management capabilities by building one integrated platform offering total volume, promotional planning, and "what-if" capabilities. The Master Data Management (MDM) hub built on the Teradata Enterprise Data Warehouse (EDW) acts as that integrated platform and single source of master data, providing single conformed values that have been cleansed and transformed from internal and external sources. The source for the Teradata EDW is SAP BW, and the target systems are Siebel and Oracle, which perform the analytics. Trade Winds will be available first for the North America business and later rolled out to the Canada business.


  • Gathering of requirements from the various systems business users.
  • Working with end users and developers to understand the data needs of an application and the larger systems.
  • Providing different designs for different database uses; for example, an OLTP database design will not be the same as one for a data warehouse.
  • Documenting the data involved including definitions, constraints and access needs
  • Creating and maintaining data definition language (DDL) that can be used to create and update the database.
  • Creating and maintaining logical designs, including Entity Relationship diagrams, which illustrate what data will be stored and how it is organized.
  • Conducting meetings with the data architects to understand the source system elements.
  • Teradata Data Warehouse centric physical database modeling experience.
  • Worked on loading data from several flat-file sources to the staging area using Teradata load utilities.
  • Created global and volatile temporary tables to load large volumes of data into the Teradata database.
  • Created, updated and maintained ETL technical documentation.
  • Built tables, views, UPI, NUPI, USI, and NUSI.
  • Wrote VBA macros with formulas, extracted data, and produced Excel reports.
  • Wrote several Teradata BTEQ scripts to implement the business logic.
  • Interacting with SME’s and Business Analysts for clarifications.
  • Performed query optimization using EXPLAIN plans, COLLECT STATISTICS, and primary and secondary indexes.
  • Involved in physical and logical design of the applications.
  • Worked with DBA’s for transition from Development to Testing and Testing to Production.
  • Wrote several Macros, Stored Procedures.
  • Worked with SQL Assistant for ad hoc querying.
  • Worked on FastLoad for loading data into the Teradata database.
  • Worked with PMON to monitor user activity and resource usage.
  • Participate in Data Modeling sessions to identify data warehouse requirements
  • Involved in design, data modeling.
  • Applied relational database theory and design, including logical and physical structures and data normalization techniques.
  • Designed and Developed logical and physical data models of systems that hold Terabytes of data.
  • Coordinated with the QA team for the derived scope of testing
  • Worked on Tuning, and troubleshooting Teradata system at various levels.
  • Worked extensively in the Teradata data warehousing environment.
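
A minimal sketch of the staging pattern described above (volatile-table load, COLLECT STATISTICS, and an EXPLAIN check); all table and column names here are hypothetical, not taken from the project:

```sql
-- Hypothetical staging load: names are illustrative only.
CREATE VOLATILE TABLE stg_promo_sales (
    promo_id   INTEGER,
    sku        INTEGER,
    sales_amt  DECIMAL(18,2)
) PRIMARY INDEX (promo_id)
ON COMMIT PRESERVE ROWS;

INSERT INTO stg_promo_sales
SELECT promo_id, sku, SUM(sales_amt)
FROM   edw.daily_sales
GROUP  BY promo_id, sku;

-- Refresh optimizer statistics on the index column.
COLLECT STATISTICS ON stg_promo_sales COLUMN (promo_id);

-- Inspect the plan before running the real query.
EXPLAIN
SELECT p.promo_id, s.sales_amt
FROM   stg_promo_sales s
JOIN   edw.promotions p ON p.promo_id = s.promo_id;
```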

Environment: Teradata V2R12, Teradata Manager, SAP BW, Teradata Analyst Pack, Sun Solaris 5.1, UNIX Shell Scripting, FastLoad, FastExport, HP Quality Center, Control-M scheduling tool.

Confidential, El Segundo, CA May 2009 - Jan 2011
Solution Architect
GMIS Stores

Fresh & Easy is a chain of supermarkets on the West Coast of the United States. It is a subsidiary of the UK-based retailer Tesco, the world's third largest retailer. Group MIS forms part of the 'Tesco Operating Model' (TOM) suite of products with the aim of standardizing processes and procedures across the Group. It is a management reporting tool consisting of a range of pre-defined 'read-only' reports designed for the Commercial Buying Teams to understand and manage their Key Performance Indicators (KPIs). Since then, further Commercial reports have been developed and the application rolled out to more countries (Turkey, China and Japan). The intention now is to create a Retail Group MIS and Supply Chain MIS solution to add to Commercial Group MIS. The US (Fresh & Easy stores) will be the first country to receive a retail reporting system. It will be known as GMIS Stores.


  • Performed data analysis and gathered columns metadata of source systems for understanding requirement feasibility analysis.
  • Created Logical Data Model from the Source System study according to Business Requirements.
  • Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign Key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index Considerations.
  • Worked closely with Business Users and analysts to come up with BO Reports and Detailed solution approach design documents.
  • Provided initial capacity and growth forecast in terms of Space, CPU for the applications by gathering the details of volumes expected from Business.
  • Prepared high-level design documents for developers; participated in review and build of the BTEQ scripts, FastLoad, FastExport, MultiLoad and JCLs; prepared unit test plans and system test cases.
  • Designed the code members to support re-startability and re-runnability maintaining the transaction integrity.
  • Supported and performed deployment activities.
  • Monitored health of the application in production environment.
  • Supported client with issues in production.
  • Worked on optimizing and tuning the Teradata SQLs to improve the performance of batch.
  • Analysed system performance using DBQL, workload manager, and table statistics.
  • Provided support during the system test, Product Integration Testing and User Acceptance Testing.
  • Used Endevor for version control of the code on z/OS, Rational ClearCase for configuration management, and Rational ClearQuest for defect tracking.
  • Verified that the implementation is done as expected i.e. check the code members are applied in the correct locations, schedules are built as expected, dependencies are set as requested.
  • Performed impact assessments in terms of schedule changes, dependency impacts, and code changes for various change requests on existing Data Warehouse applications running in the production environment.
  • Actively participated for POC (proof of concepts) with Teradata/ Informatica/ Cognos team.
  • Prepared implementation plans, support documents, and MS Visio data and scheduling flow diagrams to implement the applications into the production environment.
  • Raised Service Centre Change records to set the tasks to implementing groups, liaised between different implementation groups, monitored the implementation activities.
  • Provided fixes for any failures in Live by raising Emergency Changes.
  • Designed and developed an automated solution using Teradata macros, stored procedures, and Excel macros so business users could load reference data directly to tables.
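
One way the reference-data loading described above can be wrapped in a Teradata macro is sketched below; the database, table, and parameter names are hypothetical examples, not the project's actual objects:

```sql
-- Illustrative macro letting users insert reference rows
-- through a single parameterized call (names are invented).
CREATE MACRO ref_load.add_region (
    region_cd  VARCHAR(10),
    region_nm  VARCHAR(100)
) AS (
    INSERT INTO ref_db.region (region_cd, region_nm, load_ts)
    VALUES (:region_cd, :region_nm, CURRENT_TIMESTAMP);
);

-- Usage:
EXEC ref_load.add_region ('NA-W', 'North America West');
```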

Environment: Teradata Database 12, Erwin 4.5, Informatica PowerCenter/PowerMart 8.1, TPT, CVS, TOAD, BTEQ, SQL Assistant, Control-M scheduling tool, SAS, VBA, Windows XP, UNIX MP-RAS

Confidential, Deerfield, PA Mar 2008 - Apr 2009
Sr. Teradata developer

The main objective of the project is to identify metrics that will increase the customer base and revenue, maintain and improve profitability, maximize the return on marketing campaigns, and improve customer service. As a part of the project, the business for 13 states is migrated onto separate Teradata and SQL Server databases. The whole process is replicated on the new servers, and manually run processes are automated. Involved in daily production support activities and production fixes.


  • Communicating with business users to understand business requirements clearly
  • Conducting meetings with the data architects to understand the source system
  • Preparation of BRD, SRS and mapping documents
  • Implement the logical to physical data model using Data modeling techniques
  • Involved in modeling with a starflake schema and determining fact and dimension tables and their child tables
  • Involved with Programming support for Teradata Relationship manager.
  • Prepared presentation slides for the end users.
  • Created the reports using Business Objects functionalities like Slice and Dice, Drill Down, cross tab, Master-Detail etc.
  • Design and write custom business tools for clients utilizing Excel/VBA and SQL. Convert data, write complex macros, and perform other Excel/VBA functions for clients on a consulting basis.
  • Took the lead in communicating with the offshore team to complete the deliveries on time.
  • Involved in performing tuning and query optimizations.
  • Actively participated in building a new data warehouse using SQL, OLAP and reporting through Business Objects and Cognos.
  • Conducting peer reviews and code review meetings.
  • Debugging SAS and Teradata Macros to improve report generation
  • Created user views for other dependent and data validation teams.
  • Created the Primary index, secondary indexes and join indexes.
  • Used the EXPLAIN facility for Performance Tuning.
  • Created users, profiles, and roles to handle data access requests coming from outside the DBA team.
  • Preparation of free space, CPU, and I/O usage statistics reports.
  • Preparation of data skew reports, scheduled to be sent to all dependent teams of the EDW.
  • Involved in unit and integration testing.
  • Reviewing of Unit and Integration test case results.
  • Maintaining the production code.
  • Preparation of production supporting documents.
  • Providing Post Production Support.
  • Scheduling the Teradata jobs using the UNIX wrappers.
  • Generate weekly, monthly reports for internal and external database and system audit.
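
The user/profile/role administration mentioned above can be sketched roughly as follows; the object names, space limits, and password are invented for illustration:

```sql
-- Hedged sketch of Teradata access administration (names invented).
CREATE PROFILE analyst_prof AS
    SPOOL = 50E9,
    TEMPORARY = 20E9;

CREATE ROLE rpt_reader;
GRANT SELECT ON edw_views TO rpt_reader;

CREATE USER jdoe AS
    PERM = 0,
    PASSWORD = temp_pwd1,   -- placeholder; forced change on first logon
    PROFILE = analyst_prof;

GRANT rpt_reader TO jdoe;
```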
Environment: Windows XP, UNIX, UNIX wrappers, VBA, Teradata V2R6.2, Erwin 4.1, Teradata SQL Assistant, BTEQ, FastLoad, FastExport, MultiLoad, Oracle 9i, Business Objects, SQL Server 2000, DB, SFTP.

Confidential, Jersey City, NJ Jan 2007 - Feb 2008
Teradata Developer

The Development of EDW is the strategic phase on the Swiss Banking IT Platform (SBIP) for all reporting and analytic applications over historical and end of-day data.
Data to be analyzed are obtained from various data sources, integrated and historized. Data obtained from the source systems are loaded into so-called subject databases. As the name suggests, each such database collects data about one specific subject (such as payments or credits).
Source files, which are mainly generated from a Mainframe-based system, are loaded into the Data Warehouse on a daily or weekly basis. The source files contain business data related to different segments such as PPP, ACT, REF, MKT, SWC, and CAR. Migrated data is used for different types of reporting and analysis purposes.

  • Experienced in Designing and modeling of data.
  • Involved in converting the conceptual data to logical data and then to Physical data model.
  • Experienced in creating the databases and users
  • Space estimations and user maintenance on the development machine
  • Worked with Teradata and SAS for integrity and Performance
  • Designing the ETLs and conducting review meets
  • Involvement in implementation of BTEQ and Bulk load jobs
  • Coding using Teradata BTEQ SQL; wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment; performance-tuned long-running queries.
  • Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as Multi Load, Fast Load, Fast Export, BTEQ (Teradata) and SQL*Plus, SQL*Loader.
  • Reduced Teradata space used by optimizing tables, adding compression where appropriate, and ensuring optimum column definitions.
  • Monitoring ETL jobs until production jobs are stabilized.
  • Tuning and optimization of Teradata Queries.
  • Used UNIX wrappers to build the workflows.
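
The space reduction via compression mentioned above typically relies on Teradata's multi-value compression; a hedged sketch follows, with a hypothetical table and invented compressed-value lists:

```sql
-- Illustrative multi-value compression: frequently occurring
-- values are stored once in the table header (names invented).
CREATE TABLE edw.account (
    account_id  INTEGER NOT NULL,
    status_cd   CHAR(1) COMPRESS ('A', 'C', 'S'),
    country_cd  CHAR(2) COMPRESS ('CH', 'US', 'GB'),
    open_dt     DATE
) PRIMARY INDEX (account_id);
```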

Environment: Teradata V2R5, Teradata Queryman, Informatica 7.1.2, IBM Mainframes, Oracle 8i, UNIX Shell Scripts, Erwin 4.0, Connect:Direct, SAS; Teradata loading utilities (BTEQ, MultiLoad, FastLoad, FastExport, TPump).

Confidential, Chicago, IL Oct 2005 - Dec 2006
Sr. Teradata Developer

Zurich Financial Services is an insurance-based financial services provider with a global network that focuses its activities on its key markets in North America and Europe. In North America, Zurich is a leading commercial property-casualty insurance provider serving the global corporate, large corporate, middle market, small business, specialties and programs sectors. The company's General Insurance segment offers commercial and personal property/casualty and specialty coverage, while its Life Insurance segment offers life insurance, annuities, and other investment policies.


  • Responsible for delivering project timeline associated with the developer deliverables.
  • Data loaded into Teradata from flat files using data stage dynamic sequence job with schema file concept.
  • Developed various loading processes such as scrap re-allocation, reject activities, milestone tally, and Foundry Source Organisations in the project, using various DataStage stages such as the sort stage and aggregator stage.
  • Extensively used DataStage RDS, merge stage, and lookup, and created mappings using stages such as the transformer, lookup, aggregator, join, merge, sort, funnel, remove-duplicates, surrogate key generator, filter, and change capture stages.
  • Responsible to gather requirements from business users and format them according to the business needs, model the requirements to match the current architecture and assign it to the team
  • Optimised high volume tables (Including collection tables) in Teradata using various join index techniques, secondary indexes, join strategies and hash distribution methods.
  • Automated Oracle metadata management, consisting of wafer configurations, parameters, tools, bins, parameter aliasing, and MDE (manual data entry), using the DataStage runtime propagation method and on-the-fly MultiLoad script generation from DBC information.
  • Gained extensive knowledge of parallel jobs, server jobs, and job sequences. Used various job sequence properties to line up parallel jobs, database commands, and operating system commands, such as the exec command activity, nested condition activity, and loop activity.
  • Designed, modelled, developed and optimised complex work-in-progress (WIP) summary collections such as Summary throughput equipment, Summary throughput lot recipes etc.,
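
The join-index optimization of high-volume tables described above can be illustrated with a minimal sketch; the WIP tables and columns below are hypothetical:

```sql
-- Hedged example: a join index pre-joining two tables so the
-- optimizer can satisfy matching queries without the base join.
-- All object names are invented for illustration.
CREATE JOIN INDEX wip.ji_lot_equip AS
SELECT e.equip_id, l.lot_id, l.qty_out
FROM   wip.lots l
INNER JOIN wip.equipment e
       ON e.equip_id = l.equip_id
PRIMARY INDEX (equip_id);
```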

Environment: Windows, UNIX, vi editor, Oracle 9i, Teradata V2R6, Teradata SQL Assistant, BTEQ, FLOAD, FEXP, MLOAD, SFTP, PMON, TD Administrator, Teradata Manager, BakBone NetVault 7.0, ARC, DBSControl, DBQL, Teradata Analyst Pack, QCD
