
Sr. Teradata Developer Resume

Charlotte, NC

SUMMARY:

  • Over 7 years of experience in Information Technology, with a focus on Data Warehousing and application development.
  • Experience in Teradata 15/14/13/12 (Teradata SQL Assistant, BTEQ, Fast Load, Multi Load, TPump, TPT (Teradata Parallel Transporter), Fast Export, Visual Explain).
  • Worked with various versions of Informatica, such as Informatica 10.X/9.X/8.X.
  • Good experience on Mainframes (MVS) Job Control Language (JCL).
  • Worked in several areas of Data Warehouse including Analysis, Requirement Gathering, Design, Development, Testing, and Implementation.
  • Experience in Core Java, SQL and stored procedures, Spring JMS/messaging, XML-based processing, Java 1.6 multi-threading, and design patterns.
  • Experienced in mentoring Teradata development teams: data modeling, program code development, test plan development, dataset creation, testing and result documentation, defect analysis, and bug fixing.
  • Experienced in Software Development and gained expertise in Data Warehousing and Business Intelligence.
  • Experience with large scale data manipulation, management and analysis with large data sets on platforms such as HortonWorks, Cloudera, Aster or Vertica.
  • Extensively used Teradata application utilities like BTEQ, Multi Load, Fast Load, TPUMP and TPT.
  • Extensively worked in Mainframe/Unix and Informatica environments to invoke Teradata utilities and for file handling.
  • Good exposure to Insurance and Banking Domains.
  • Good knowledge of Dimensional Data Modeling, Star Schema, Snowflake Schema, and FACT and Dimension tables.
  • Worked in a remediation (performance tuning) team, improving the performance of user queries and production SQL.
  • Very good customer-facing skills and experience in release and change management.
  • Proficient in Teradata Database Design (Physical and Logical), Application Support and setting up the Test and Development environments.
  • Involved in Data Migration between Teradata and DB2 (Platinum).
  • Strong data warehousing experience specialized in Teradata and ETL concepts, strategy, design, architecture, development and testing, including Extraction, Transformation and Loading (ETL) of data from various sources into data warehouses using Teradata load utilities.
  • Extensive experience in developing complex Informatica mappings/mapplets using various transformations for extraction, transformation and loading of data from multiple sources into data warehouses, and in creating workflows with worklets and tasks and scheduling them with Workflow Manager.
  • Good exposure to Teradata Manager, FASTLOAD, MULTILOAD, TSET, TPUMP, SQL, PDCR and ARCMAIN, TASM for workload management.
  • Expertise in using Teradata SQL Assistant, Teradata Manager and data load/export utilities like BTEQ, Fast Load, Multi Load, Fast Export and exposure to Tpump on UNIX environment.
  • Experience in Dimensional Modeling, ER Modeling, Star Schema/Snowflake Schema/3NF, FACT and Dimension tables, and Operational Data Stores (ODS).
  • Exposure to data mapping/modeling for data warehousing and acquisition/conversion-related tasks.
  • Strong experience in Teradata development and indexes (PI, SI, partitioning, join indexes).
  • Worked on Data Mining techniques for identifying claims on historical data.
  • Experienced in development/support of various data warehousing applications in communication, health services, insurance, banking and Financial industries.
  • Proficient in performance tuning from design architecture to application development.
  • Architected and implemented user authentication via LDAP in multiple security enclaves to support users, hosts, and access control.
  • Expertise in database programming like writing Stored Procedures (SQL), Functions, Triggers, Views in Teradata, DB2 & MS Access.
  • Exposure to extracting, transforming and loading (ETL) data from other databases and text files using SQL Server ETL tools.
  • Played various roles on Projects that required Data warehouse consulting which include Data warehouse ETL Architect, Program Analyst, Technical Lead, Project Lead and Managerial roles.
  • Experience in writing complex SQL to implement business rules and to extract and analyze data.

TECHNICAL SKILLS:

Teradata Utilities: BTEQ, Fast Load, Multi Load, TPump, Fast Export, Queryman (SQL Assistant), TPT

Databases: Teradata 15, Teradata 14, Teradata 13, Teradata 12, Oracle, SQL Server

ETL Tools: Informatica Power Center 8.X/9.X/10.X

Languages: Job Control Language (JCL), Java, SQL

Other tools: File-Aid, Endevor, TSO, Peregrine Service Center, Exceed 8.0, Redbox, Xpediter, Hummingbird BI Query, Unix, Korn Shell, Maestro (Job Scheduling Console), Tivoli Workload Scheduler, One Automation, Jira, HP ALM, Coin

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

Sr. Teradata Developer

Responsibilities:

  • Analyzing the business requirements specified by the client
  • Preparation of technical requirements and high-level and low-level design documents based on the business process document.
  • Loading data from flat files into warehouse landing-zone tables using Teradata utilities.
  • Writing BTEQ Scripts to load data from landing zone to Work tables in staging area based on the ETL mapping documentation.
  • Designed complex UNIX scripts and automated them to run the workflows daily, weekly and monthly.
  • Unix shell scripting to invoke the Teradata BTEQ scripts.
  • Creation of views in Target database to calculate measures from different schemas.
  • Performance tuning of BTEQ scripts.
  • Scheduling of jobs in One Automation on a daily or intra-day frequency.
  • Extensively worked on data extraction, transformation and loading from source to target systems using Informatica and Teradata utilities like Fast Export, Fast Load, Multi Load and TPT.
  • Developed complex mappings to load data from Source System (Oracle) and flat files to Teradata.
  • Worked with Teradata utilities like BTEQ, Fast Load and Multi Load.
  • Experience in building logical and physical data models using the data modeling tools Erwin and ER Studio.
  • Working knowledge of data warehousing concepts like Star Schema and Snowflake Schema, Data Marts, and the Kimball methodology used in relational, dimensional and multidimensional data modeling.
  • Established application hierarchies, databases, users, roles and profiles as per requirements.
  • Responsible for performance monitoring, resource and priority management, space management, user management, index management, access control, and executing disaster recovery procedures.
  • Created Teradata physical models using ER Studio by identifying the right PI, PPI and other indexes. Created both the base layer and semantic layers. Worked with the enterprise Data Modeling team on creation of logical models.
  • Created scripts using FastLoad, Multi-Load to load data into the Teradata data warehouse.
  • Involved in writing SQL scripts, macros and stored procedures in Teradata to implement business rules.
  • Updated numerous BTEQ/SQL scripts, making appropriate DDL changes and completed unit test.
  • Worked on Teradata SQL Assistant querying the target tables to validate the BTEQ scripts.
  • Used Informatica PowerCenter Workflow manager to create sessions and workflows to run the logic embedded in the mappings.
  • Wrote UNIX scripts to run and schedule workflows for the daily runs.
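The BTEQ loading described above can be sketched roughly as follows. This is only an illustrative example: the logon string, databases, tables and columns are all hypothetical, not taken from the actual project.

```sql
-- Hypothetical BTEQ script: refresh a staging work table from a
-- landing-zone table per an ETL mapping. All names are illustrative.
.LOGON tdprod/etl_user,password;

-- Clear the work table before the load
DELETE FROM stg_db.wrk_customer;

INSERT INTO stg_db.wrk_customer (cust_id, cust_name, load_dt)
SELECT cust_id,
       TRIM(cust_name),
       CURRENT_DATE
FROM   lz_db.lz_customer
WHERE  load_dt = CURRENT_DATE;

-- Surface failures to the calling shell script via a non-zero return code
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

A wrapper shell script would typically check the BTEQ return code and fail the scheduled job on anything non-zero.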

Environment: Teradata 15.10, Unix (Shell scripting), Teradata - Bteq, Fast load, Multi load, Fastexport, Viewpoint, Informatica power center 10.1/10.2, Scheduling tool- One Automation.

Confidential, Cincinnati, OH

Sr. Teradata Developer

Responsibilities:

  • Extensively worked on data extraction, transformation and loading from source to target systems using Informatica and Teradata utilities like Fast Export, Fast Load, Multi Load and TPT.
  • Developed complex mappings to load data from Source System (Oracle) and flat files to Teradata.
  • Worked with Teradata utilities like BTEQ, Fast Load and Multi Load.
  • Experience in building logical and physical data models using the data modeling tools Erwin and ER Studio.
  • Working knowledge of data warehousing concepts like Star Schema and Snowflake Schema, Data Marts, and the Kimball methodology used in relational, dimensional and multidimensional data modeling.
  • Established application hierarchies, databases, users, roles and profiles as per requirements.
  • Responsible for performance monitoring, resource and priority management, space management, user management, index management, access control, and executing disaster recovery procedures.
  • Created Teradata physical models using ER Studio by identifying the right PI, PPI and other indexes. Created both the base layer and semantic layers. Worked with the enterprise Data Modeling team on creation of logical models.
  • Created scripts using FastLoad, Multi-Load to load data into the Teradata data warehouse.
  • Involved in writing SQL scripts, macros and stored procedures in Teradata to implement business rules.
  • Updated numerous BTEQ/SQL scripts, making appropriate DDL changes, and completed unit tests.
  • Worked on Teradata SQL Assistant querying the target tables to validate the BTEQ scripts.
  • Used Informatica Power Center Workflow manager to create sessions and workflows to run the logic embedded in the mappings.
  • Wrote UNIX scripts to run and schedule workflows for the daily runs.
  • Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in the job.
  • As a developer, worked on multi-tier Java and J2EE based applications; responsible for writing business logic using core Java and JavaBeans, SQL queries for the backend RDBMS, and the user frontend using JSP along with JavaScript libraries, Ajax and HTML.
  • After setting up Java files on the CMS, downloaded them to the computer and uploaded them via File Transfer Protocol (FTP) to the web host account.
  • Created different transformations like Source Qualifier, Expression, Filter, Lookup transformations for loading the data into the targets.
  • Wrote complex SQL scripts to avoid Informatica joiners and look-ups to improve the performance as the volume of the data was heavy.
  • Wrote scripts to ensure e-mails sent to respective people on Failure/Success, error handling, control/auditing of different processes.
  • Automated the Informatica process to update a status table in Teradata when maps run successfully, after which a view is run.
  • Architected and implemented user authentication via LDAP in multiple security enclaves to support users, hosts, and access control.
  • DBA skills in Teradata DBMS administration, initial environment setup, development and production DBA support, Teradata Manager and Viewpoint.
  • Involved in the analysis and optimization of long running jobs
  • Involved in interaction with client resources to discuss tasks and issues
  • Timely escalation of issues to avoid delay in deliverables
  • Development of ETL Mappings, Workflows and prepare them for deployment into test and production environments
  • Participate in code reviews and ensure that all solutions are aligned to pre-defined architectural specifications
  • Involved in UAT and Production support and implementation
  • Analysis of the Unix and SQL scripts to rate them in terms of complexity
  • Worked on Teradata Viewpoint to check the health of the system.
  • Used session management tools such as Query Monitor and Query Spotlight to check the performance of individual queries before implementing them in production, with an eye toward avoiding future issues.
  • Performed tuning of highly skewed queries.
  • Handled spool space requests from end users.
  • Created join indexes as per architectural strategies and standards, and created PPIs on large tables to improve query performance.
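The partitioning and join-index tuning mentioned above can be illustrated with hypothetical DDL. Every object name, column and date range below is invented for the sketch, not taken from the actual system.

```sql
-- Hypothetical large fact table, range-partitioned by day (PPI):
CREATE TABLE edw_db.sales_fact
( sale_id   BIGINT NOT NULL,
  store_id  INTEGER,
  sale_dt   DATE NOT NULL,
  sale_amt  DECIMAL(18,2)
)
PRIMARY INDEX (sale_id)
PARTITION BY RANGE_N (sale_dt BETWEEN DATE '2015-01-01'
                      AND     DATE '2016-12-31'
                      EACH INTERVAL '1' DAY);

-- Hypothetical aggregate join index that pre-aggregates a frequent query
-- so the optimizer can answer it without scanning the full fact table:
CREATE JOIN INDEX edw_db.sales_by_store_ji AS
SELECT store_id, sale_dt, SUM(sale_amt) AS total_amt
FROM   edw_db.sales_fact
GROUP  BY store_id, sale_dt
PRIMARY INDEX (store_id);
```

Partition elimination on `sale_dt` and rewrite against the join index are what typically deliver the performance gain for date-bounded, store-level queries.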

Environment: Teradata 14, Unix, Korn Shell, Teradata - Bteq, Fast load, Multi load, Fastexport, Viewpoint.

Confidential, Berkeley Heights, NJ

Sr. Teradata Developer

Responsibilities:

  • Involved with business requirements, Technical requirements, and Design documents and coordinated with Data analyst team on the requirements.
  • Work in coordination with DBA Team on remediation activities. Create users, databases and roles.
  • Allocate database space and monitor queries and system performance using Viewpoint and PMON.
  • Developed complex SQL and BTEQ scripts for processing data per coding standards.
  • Invoked shell scripts to execute BTEQ, Fast Load and Multi Load utilities.
  • Invoked Korn shell scripts to do reconciliation checks and pass parameter files.
  • Used the Core Java for the basic programming of the modules.
  • Used core Java to build new functionality based on requirements.
  • Developed few modules using Java/ J2EE technologies, Ext JS and Oracle.
  • Worked on conceptual/logical/physical data model level using ERWIN according to requirements.
  • Interacted with various Business users in gathering requirements to build the data models and schemas.
  • Extensively worked on Mainframe/Unix and Informatica environments to invoke Teradata Utilities and file handlings.
  • Expertise with Job scheduling console (Tivoli) on GUI and MVS.
  • Developed and reviewed the code, supported QA, and performed unit and UAT testing.
  • Implemented changes in coordination with the Infrastructure team and provided warranty support.
  • Troubleshooting any issues related to production, database development and documenting issues.
  • Worked on improvement of existing processes by integrating various sources and helping maintain the code, ensuring the process is efficient.
  • Extracted data from various sources like Oracle, DB2, and SQL server and loaded into Teradata.
  • Reviewed statistics and joins for performance improvement of Teradata SQL using DBQL and Explain Plans.
  • Made changes to the physical model to assist with performance improvement by implementing partitioning, compression and indexes on EDW tables.
  • Performed tuning of highly skewed queries.
  • Monitored Viewpoint and PMON to check for system-impacting queries.
  • Handled spool space requests from end users.
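A minimal sketch of the kind of Korn shell reconciliation check described above, comparing a source extract's row count against the count a load step reported. All paths and file names are hypothetical, and mock data stands in for the real extract and log:

```shell
#!/bin/ksh
# Hypothetical reconciliation check (illustrative paths and names).
extract=/tmp/recon_extract.dat
loadlog=/tmp/recon_rowcount.log

printf 'row1\nrow2\nrow3\n' > "$extract"   # mock source extract
echo 3 > "$loadlog"                        # mock loaded-row count from the load log

src_count=$(wc -l < "$extract" | tr -d '[:space:]')
tgt_count=$(cat "$loadlog" | tr -d '[:space:]')

if [ "$src_count" -eq "$tgt_count" ]; then
    echo "RECON OK: $src_count rows"
else
    # A real job would alert and stop the schedule here
    echo "RECON MISMATCH: source=$src_count target=$tgt_count" >&2
    exit 1
fi
```

In a scheduled job, the non-zero exit on mismatch is what lets the scheduler halt downstream steps and page support.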

Environment: Teradata 13, Windows, Korn Shell, Teradata - Bteq, Fast load, Multi load, Fastexport, Oracle, Informatica Power Center 9.6, SQL server, Aqua data studio, Toad.

Confidential, Alpharetta, GA

TeraData Developer

Responsibilities:

  • Involved with business requirements, Technical requirements, and Design documents and coordinated with Data analyst team on the requirements.
  • Managed database space, allocating new space to databases and moving space between databases on an as-needed basis.
  • Created roles and profiles as needed; granted privileges to roles and added users to roles based on requirements.
  • Extensively worked with DBQL data to identify high-usage tables and columns. Redesigned the logical and physical data models.
  • Worked on ETL with Informatica Power Center 9.5, creating mappings/workflows and invoking BTEQ scripts via workflows.
  • Developed complex SQL and BTEQ scripts for processing data per coding standards.
  • Invoked shell scripts to execute BTEQ, Fast Load, Multi Load and TPump utilities.
  • Invoked Korn shell scripts to do reconciliation checks and pass parameter files.
  • Extensively worked in Mainframe/Unix and Informatica environments to invoke Teradata utilities and for file handling.
  • Expertise with Job scheduling console (Tivoli) on GUI and MVS.
  • Developed the OTC Product Master System using Java /J2EE technologies and Oracle.
  • Developed a custom rules engine to execute rules formulated by various DCOs, with support for dynamic rule updates, achieved via Java reflection and loading the rules from the DB.
  • Expertise in understanding data modeling involving logical and physical data model.
  • Involved in creating data models using Erwin.
  • Involved in the development of the new RDH 2.0 system using the latest Java/J2EE technologies.
  • Worked on a POC using the TPT utility in place of the regular utilities to compare performance with the existing process.
  • Worked on a design for updating existing applications to integrate TPT for processing heterogeneous files, using the Load, Update, Stream and Export operators along with the Selector, Inserter, DataConnector and ODBC operators.
  • Worked in a hybrid environment built as a combination of 3NF and dimensional models.
  • Developed and reviewed the code, supported QA, and performed unit and UAT testing.
  • Coordinated with the offshore team on development activities.
  • Implemented changes in coordination with the Infrastructure team and provided warranty support.
  • Worked on the production support team, handling production issues and space forecasting on the system.
  • Monitored and aborted bad queries using PMON, looked for blocked sessions, and worked with development teams to resolve them.
  • Troubleshooting any issues related to production, database development and documenting issues.
  • Worked on improvement of existing processes by integrating various sources and helping maintain the code, ensuring the process is efficient.
  • Reviewed statistics and joins for performance improvement of Teradata SQL using DBQL and Explain Plans.
  • Made changes to the physical model to assist with performance improvement by implementing partitioning, compression and indexes on EDW tables.
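A TPT job of the kind described above pairs a producer operator with a consumer operator in a single script. The sketch below shows the shape only: the job name, schema, file, credentials and table are all hypothetical.

```sql
/* Hypothetical TPT job: read a delimited flat file with the DataConnector
   producer and bulk-load it with the Load operator. Names are illustrative. */
DEFINE JOB load_customer
(
  DEFINE SCHEMA cust_schema
  ( cust_id   VARCHAR(10),
    cust_name VARCHAR(50) );

  DEFINE OPERATOR file_reader
  TYPE DATACONNECTOR PRODUCER
  SCHEMA cust_schema
  ATTRIBUTES ( VARCHAR FileName      = 'customer.dat',
               VARCHAR Format        = 'Delimited',
               VARCHAR TextDelimiter = '|' );

  DEFINE OPERATOR td_load
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES ( VARCHAR TdpId        = 'tdprod',
               VARCHAR UserName     = 'etl_user',
               VARCHAR UserPassword = 'password',
               VARCHAR TargetTable  = 'stg_db.customer' );

  APPLY ('INSERT INTO stg_db.customer (:cust_id, :cust_name);')
  TO OPERATOR (td_load)
  SELECT * FROM OPERATOR (file_reader);
);
```

Swapping `TYPE LOAD` for `TYPE UPDATE` or `TYPE STREAM` changes the loading protocol without changing the overall job structure, which is what makes TPT attractive as a replacement for the separate legacy utilities.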

Environment: Teradata 12, Informatica Power Center 9.5, Unix, Korn Shell, Teradata - Bteq, Fast load, Multi load, Fastexport, TPUMP, Oracle, Mainframes, SQL Server, Aqua Data Studio, Toad.

Confidential

Teradata Developer

Responsibilities:

  • Involved in writing complex SQL queries to pull the required information from the database using Teradata SQL Assistant.
  • Involved in the Data warehouse data modeling based on the client requirements.
  • Developed Logical and physical database designs for the transaction system.
  • Responsible for configuring the Workflow manager, Repository Server Administration Console, Power Center Server, Database Connection.
  • Worked with DBAs to create a best fit physical data model from the logical data model.
  • Worked on conceptual/logical/physical data model level using ER Studio/ERWIN according to requirements.
  • Performance tuning on sources, targets, mappings and SQL queries in the transformations.
  • Created reusable transformations and mapplets based on the business rules to ease the development.
  • Used various Informatica Error handling technique to debug failed session.
  • Generated Java files using Castor for marshalling/un-marshalling of data with XML and XSD, and made calls to the billing system's middleware APIs.
  • Involved in developing the application in Java with Struts framework using CE and PE API's.
  • Development of scripts for loading data into the base tables using the FastLoad, MultiLoad and BTEQ utilities of Teradata.
  • Applied various optimization techniques in the Aggregator, Lookup, and Joiner transformations.
  • Developed mapping to implement type 2 slowly changing dimensions.
  • Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
  • Used various debugging techniques to debug the mappings.
  • Created Materialized view for summary data to improve the query performance.
  • Responsible for scheduling the workflow based on the nightly load.
  • Supported Oracle 8i databases running mission critical 24*7 systems.

Environment: Teradata SQL Assistant, Informatica - Power Center 9.X, Oracle 8i, Erwin Data Modeler, BTEQ, FastLoad, Unix, Multiload, TOAD, Windows XP

Confidential

Teradata Developer

Responsibilities:

  • Impact analysis and Estimation preparation based on the business requirement.
  • Preparing Design/Small Change definition documents for any new process or small changes that are required to be applied to the Data Warehouse.
  • Design review meeting to get approval on the changes.
  • Mostly worked on Dimensional Data Modeling, Star Schema and Snowflake Schema modeling.
  • Code development as per the requirement and document using Teradata utilities like Fast Load, Multi Load, Fast Export and BTEQ SQL.
  • ETL tool Informatica was used to load strategic source data to build the data marts.
  • Unit Test, System Test & UAT of code developed.
  • Involved in data migration projects from DB2 to Teradata.
  • Documentation of the changes as per the DWS working practice.
  • Worked on changes to existing website to address scalability issues in Java.
  • Presenting the changes to DBA team and support team for handover.
  • Release meetings with Client’s CAB (Change Approval Board) to detail changes going into a release and get the approval for implementing changes.
  • Implementation & Storm Support.

Environment: Teradata 12, IBM Mainframe, UNIX, Informatica Power Center 8.X, Teradata, DB2 (Platinum), Hummingbird BI Query.
