Teradata/ETL Developer Resume

FL

SUMMARY:

  • 12 years of total IT experience
  • 11+ years as Teradata/ETL Developer
  • SDLC: 10+ years of experience analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables within committed deadlines. Diverse industry experience, including Telecom, Retail, Financial and Manufacturing.
  • Teradata: 10 years of Teradata 15.10/14/13/12/V2R6/V2R5/V2R4.1/V2R3 and Teradata SQL Assistant (Queryman). Experience with data load/unload utilities such as BTEQ (Import/Export), FastLoad, MultiLoad, TPump and FastExport. Worked with Teradata administrative utilities such as Teradata Manager, Teradata Administrator, Teradata Visual Explain and Teradata Index Wizard. OLAP, OLTP, ETL, BI and SyncSort for NCR 4300/5200. Proficient in Teradata V2R5 database design (logical and physical), query optimization, SQL performance tuning, PL/SQL and dynamic SQL.
  • Extensive knowledge of BI technologies such as Tableau and Cognos.
  • Teradata Skills: BTEQ, FastLoad, MultiLoad, TPT, TPump, SQL Assistant, Viewpoint, Query Monitor.
  • 4 years of dimensional data modeling experience using Erwin 4.5/4.0/3.5.5/3.5.2: dimensional modeling, data marts, OLAP, and Fact and Dimension tables.
  • 10 years of experience using Oracle 11g/10g/9i/8i/8.0/7.0, DB2 8.0/7.0/6.0 (including DB2 for OS/390), MS SQL Server 2005/2000/8.0/7.0/6.5/6.0, Teradata V2R5/V2R4/V2R3, Sybase 12.x/11.x, MS Access 97/2000/7.0, SQR 3.0, Erwin 3.5/3.x, SQL, XML, PL/SQL, SQL*Plus, SQL*Loader and Developer 2000.

TECHNICAL SKILLS:

Teradata: Teradata 15.10/14/13/V2R5/V2R4.1/V2R3; data load/unload utilities such as BTEQ, FastLoad, MultiLoad, TPump and FastExport; Teradata administrative utilities such as Archive/Restore, Table Rebuild, CheckTable, Configuration, Reconfiguration, Filer and DIP; OLAP, OLTP, ETL, BI; SyncSort for UNIX/Mainframes; NCR 4300/5200.

Dimensional Data Modeling: Dimensional data modeling, Star Join Schema modeling, Snowflake modeling, Fact and Dimension tables, physical and logical data modeling, Erwin 4.5/4.0/3.5/3.2, Oracle Designer, Synopsis, Visio and Sybase PowerDesigner.

Databases: Oracle 10g/9i/8i/8.0/7.x, IBM DB2 UDB 8.0/7.2/7.0, MS SQL Server 2005/2000/7.0/6.0, Teradata V2R12/V2R5/V2R4/V2R3, EDI, SAP R/3, Sybase 12.x/11.x, MS Access

Programming GUI: SQL, PL/SQL, SQL*Plus, Transact-SQL, ANSI SQL, Synopsis, Visual Basic

Environment: HP-UX 10.20/9.0, IBM Mainframes, AIX 4.2/4.3, MS-DOS 6.22, Novell NetWare 4.11/3.61, Windows 3.x/95/98/NT/2000/XP, Red Hat Linux, Sun Ultra, Sun SPARC, Sun Classic, SCO UNIX, HP9000, RS6000.

PROFESSIONAL EXPERIENCE:

Confidential, FL

Teradata/ETL Developer.

Responsibilities:

  • Understanding the business requirements and developing design specifications for enterprise applications using Teradata.
  • Reviewed mapping documents provided by the Business team, implemented the business logic embedded in them as Teradata SQL, and loaded the tables needed for data validation.
  • Writing BTEQ scripts for validation and testing of sessions, for data-integrity checks between source and target databases, and for report generation (see the validation sketch after this list).
  • Helping the Reporting team by providing the Teradata SQL queries and loading data to tables.
  • Created Mapping tables for Data load and Complex Report views for Business Objects Universes and Dashboard reports.
  • Teradata stored procedure, view and BTEQ development, and involvement in code review meetings.
  • Supporting the Revenue Recognition application.
  • Created BTEQ scripts to extract data from warehouse for downstream applications.
  • Generated various Reports and Extracts for analysis purposes.
  • Used UNIX shell scripts to automate BTEQ, TPT and other utility tasks.
  • Performance tuning of long-running queries.
  • Incorporated transformations such as Filter, Sorter and Labeler in the IDQ environment.
  • Responsible for migrating data from the Developer tool to Informatica PowerCenter.
  • Preparing the production deployment game plan, back-out plan and test scripts.
  • Worked with DBAs to tune application performance and manage backups.
  • Monitoring load jobs and issues.
  • Supporting production deployment and patching.
  • Testing support, including SIT/UAT, with issue tracking and resolution.
  • Worked on resolving production issues, providing appropriate solutions within SLAs.
  • Used the Service Request tool to log and track tickets/service requests.
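
As a hedged illustration of the BTEQ validation scripting mentioned above: a minimal sketch that reconciles row counts between a source and a target table and fails the job when keys are missing. The database, table and column names (SRC_DB.ORDERS, TGT_DB.ORDERS, order_id) and the logon file are hypothetical placeholders, not the actual project objects.

      .RUN FILE = logon.btq;                   -- logon.btq holds the .LOGON tdpid/user,password line

      -- Row-count reconciliation between source and target
      SELECT 'SRC' AS side, COUNT(*) AS row_cnt FROM SRC_DB.ORDERS
      UNION ALL
      SELECT 'TGT', COUNT(*) FROM TGT_DB.ORDERS;

      -- Return a row only when at least one source key is missing in the target
      SELECT COUNT(*) AS missing_keys
      FROM SRC_DB.ORDERS src
      LEFT JOIN TGT_DB.ORDERS tgt
        ON src.order_id = tgt.order_id
      WHERE tgt.order_id IS NULL
      HAVING COUNT(*) > 0;

      -- If the previous SELECT returned a row, exit with a non-zero return code
      -- so a scheduler or wrapper script can flag the load for investigation.
      .IF ACTIVITYCOUNT > 0 THEN .QUIT 8;

      .LOGOFF;
      .QUIT 0;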

Environment: Teradata V15.10, Teradata SQL Assistant, Teradata load utilities (BTEQ, MultiLoad, FastLoad, FastExport), HP-UX.

Confidential, Chicago, IL

Teradata/ETL Developer.

Responsibilities:

  • Collaborate with business analysts for requirement gathering, business analysis and designing of enterprise data warehouse.
  • Participate in JAD sessions and meetings to study the source system data and understand how data related to sales representatives and the billing system is linked across the different HSBC source systems. This is essential to identify the right salesperson eligible for compensation.
  • Drop and alter tables in the CST and SPM databases; export table data to flat or CSV files with the required delimiter, and import data from flat or CSV files into Teradata tables. Perform validations between source and staging and between staging and target tables for different scenarios (for example, at the CIN and HUB CIN level in fact and dimension tables) to ensure the billing revenue shown on reports matches business expectations.
  • Determine UPI, NUPI, USI and NUSI for data application layer tables as part of tuning high-CPU-impact queries.
  • Analyze the data and its hash distribution to define a suitable index (Unique Primary Index, Non-Unique Primary Index, Unique Secondary Index or Non-Unique Secondary Index) on each table so that the right access path is selected. Defining such indexes matters because it avoids overloading the server.
  • Write SQL queries using correlated sub queries, joins and recursive queries.
  • Write SQL statements using joins, functions, clauses and conditions to load data from CST staging compensation tables to SPM target compensation tables, performing lookups on existing Customer, Account and Billing tables to implement Slowly Changing Dimension Types 1, 2 and 3 (a Type 2 sketch follows this list). This makes it possible to know the latest bill generated for each customer based on whether the account is closed or open.
  • Extract source data from other relational sources such as Oracle and SQL Server using ETL tools: either Informatica, with transformations such as Filter, Router, Expression and Lookup, or DataStage, with stages such as Filter, Join and Transformer, to load into the CST database. The main purpose is to generate more efficient and accurate reports on sales representatives' performance.
  • Use Teradata SQL Assistant to build/test the SQL queries.
  • Define ODBC connections to the SPM Dev, SPM UAT and SPM Prod servers for writing SQL statements to insert, update and create tables.
  • Extensively use the Teradata utilities BTEQ, FastLoad, MultiLoad, TPump, FastExport and TPT (SQL Inserter, Load, Update, Stream, Export and ODBC operators) to load data into and export data from database objects.
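
To make the Slowly Changing Dimension logic above concrete, here is a minimal Type 2 sketch in Teradata SQL. The staging and target tables (STG.CUST, SPM_TGT.CUST_DIM), the tracked attribute (addr) and the housekeeping columns (eff_start_dt, eff_end_dt, curr_ind) are illustrative assumptions, not the actual CST/SPM schema.

      -- Step 1: expire the current version of any customer whose tracked
      -- attribute changed (Teradata's UPDATE ... FROM join syntax).
      UPDATE tgt
      FROM SPM_TGT.CUST_DIM AS tgt, STG.CUST AS stg
      SET eff_end_dt = CURRENT_DATE - 1,
          curr_ind   = 'N'
      WHERE tgt.cust_id  = stg.cust_id
        AND tgt.curr_ind = 'Y'
        AND tgt.addr    <> stg.addr;

      -- Step 2: open a new current version for changed customers and insert
      -- brand-new customers. Because step 1 already expired the changed rows,
      -- both cases show up here as "no current row in the target".
      INSERT INTO SPM_TGT.CUST_DIM
          (cust_id, addr, eff_start_dt, eff_end_dt, curr_ind)
      SELECT stg.cust_id, stg.addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM STG.CUST AS stg
      LEFT JOIN SPM_TGT.CUST_DIM AS tgt
        ON  tgt.cust_id  = stg.cust_id
        AND tgt.curr_ind = 'Y'
      WHERE tgt.cust_id IS NULL;

Type 1 would simply overwrite the attribute in place, and Type 3 would carry a prior-value column, so only the Type 2 case needs this expire-and-insert pair.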

Confidential, Tampa, FL

Teradata/ETL Developer.

Responsibilities:

  • Communication with business users and analysts on business requirements.
  • Worked in complete Software Development Lifecycle Experience (SDLC) from Requirement gathering to Development, Testing, Deployment and Documentation.
  • Fulfilled ad-hoc requests from superiors.
  • Working on SQLs to retrieve required data.
  • Helping the Reporting team by providing the Teradata SQL queries.
  • Created Mapping tables for Data load and Complex Report views for Business Objects Universes and Dashboard reports.
  • Involvement in implementation of BTEQ and Bulk load jobs.
  • Worked on Teradata PMON and Teradata Manager.
  • Created BTEQ scripts to extract data from warehouse for downstream applications.
  • Exporting data using BTEQ and FastExport.
  • Executing SQLs in the UNIX environment.
  • Performance tuning of long-running queries.
  • Incorporated transformations such as Filter, Sorter, Labeler and Address Doctor in the IDQ environment.
  • Responsible for migrating data from the Developer tool to Informatica PowerCenter.
  • Developed processes on Teradata using RDBMS utilities such as FastLoad, FastExport and BTEQ.
  • Worked with DBAs to tune the performance of the applications and Backups.
  • Participated in development of a Hadoop data lake environment and data ingestion using Sqoop, Impala and Hive. Experience using Python and HDFS commands. Worked on Sqoop and on complex batch and ETL data pipelines. Experience loading data into Hive and Impala tables.
  • Validated data in Hue and used Toad to connect to data lake structures.
  • Developed MLOAD scripts to load data from load-ready files into the Teradata warehouse (see the MultiLoad sketch after this list).
  • Developed simple to complex MapReduce jobs using Hive, Sqoop and Pig.
  • Involved with Data Steward Team for designing, documenting and configuring Informatica Data Quality environment for management of data.
  • Documented the issues and actions taken related to data cleansing using cleanse lists and predefined cleanse functions.
  • Responsible for end-to-end (E2E) integration of data from the Informatica IDQ environment to the EDW.
  • Tuning and optimization of Teradata Queries.
  • Worked on UDF for masking data.
  • Code reviews for production issues.
  • Monitoring load jobs and issues.
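
As a sketch of the MLOAD scripts referenced above (hypothetical object and file names throughout; the real load-ready layouts differ, and the target columns are assumed to be character or implicitly castable):

      .LOGTABLE etl_wrk.orders_ml_log;         -- restart log table for MultiLoad
      .RUN FILE logon.btq;                     -- contains the .LOGON line

      .BEGIN IMPORT MLOAD
          TABLES      stg_db.orders
          WORKTABLES  etl_wrk.orders_wt
          ERRORTABLES etl_wrk.orders_et etl_wrk.orders_uv;

      .LAYOUT order_layout;                    -- VARTEXT input requires VARCHAR fields
      .FIELD in_order_id  * VARCHAR(18);
      .FIELD in_order_amt * VARCHAR(18);
      .FIELD in_order_dt  * VARCHAR(10);

      .DML LABEL ins_orders;
      INSERT INTO stg_db.orders (order_id, order_amt, order_dt)
      VALUES (:in_order_id, :in_order_amt, :in_order_dt);

      .IMPORT INFILE /data/loadready/orders.dat
          FORMAT VARTEXT '|'
          LAYOUT order_layout
          APPLY ins_orders;

      .END MLOAD;
      .LOGOFF;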

Environment: Teradata V14.10, Teradata SQL Assistant, Teradata load utilities (BTEQ, MultiLoad, FastLoad, FastExport), Hadoop 1.1.2, Hive 0.9.0, Pig 0.11.1, Sqoop 1.4, HP-UX.

Confidential, VA

Teradata Developer

Responsibilities:

  • Work on SQLs to get required data.
  • Help the reporting team by providing the Teradata SQL queries.
  • Involve in implementation of BTEQ and Bulk load jobs.
  • Create BTEQ scripts to extract data from warehouse for downstream applications.
  • Export data using BTEQ and FastExport (see the FastExport sketch after this list).
  • Execute the SQLs on Unix environment.
  • Develop processes on Teradata using RDBMS utilities such as FastLoad, FastExport and BTEQ.
  • Troubleshoot the failure Jobs.
  • Develop MLoad scripts to load data from Load Ready Files to Teradata Warehouse.
  • Solved various defects in the set of wrapper scripts that executed the Teradata BTEQ, MLoad and FLoad utilities.
  • Wrote Teradata SQL queries according to process needs.
  • Used Visual Explain for query optimization.
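
A hedged FastExport sketch for the export bullet above (server, credentials, table and path names are placeholders):

      .LOGTABLE etl_wrk.orders_fexp_log;
      .LOGON tdprod/etl_user,etl_password;     -- placeholder credentials

      .BEGIN EXPORT SESSIONS 4;

      .EXPORT OUTFILE /data/extracts/orders_extract.dat
          MODE RECORD FORMAT TEXT;

      -- Build one pipe-delimited record per row; casts keep the output readable.
      SELECT TRIM(CAST(order_id  AS VARCHAR(18))) || '|' ||
             TRIM(CAST(order_amt AS VARCHAR(18)))
      FROM edw_db.orders
      WHERE order_dt = CURRENT_DATE - 1;

      .END EXPORT;
      .LOGOFF;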

Environment: Teradata V2R13, Teradata Queryman, Oracle 8i, Teradata load utilities (BTEQ, MultiLoad, FastLoad and FastExport)

Confidential, Phoenix, AZ

Teradata/ETL Developer

Responsibilities:

  • Worked on loading data from several flat-file sources into staging using Teradata MLOAD and FLOAD.
  • Hands-on experience using the CVS code repository for code check-in and checkout.
  • Involved in business requirements, technical requirements, high-level design, and detailed design process.
  • Performed unit and system tests for the modified code and loaded shadow data marts for testing prior to production implementation.
  • Worked on stored procedures and SQL tuning, and improved the performance of queries (see the statistics/EXPLAIN sketch after this list).
  • Involved in maintaining clean, accurate and consistent (Confidential CAS - Credit Card Authorization System) data shared across systems enterprise-wide using Teradata MDM (Master Data Management) methodology.
  • Acted as the lead developer for all the ETL jobs, reading data from the vendors and affiliates, and finally loading dimension, fact and other aggregate tables.
  • Designed and developed using transformations such as Filter, Sorter, Parser and Address Doctor in the IDQ environment.
  • Responsible for table creation.
  • Analyze profiling results and make recommendations for improvements.
  • Build a re-usable staging area in Oracle for loading data from multiple source systems using template tables for profiling and cleansing in IDQ or QualityStage.
  • Worked with the users and testing teams to implement the business logic as expected.
  • Wrote several Teradata BTEQ scripts to implement the business logic.
  • Worked extensively with Teradata SQL Assistant to interface with the Teradata database.
  • Generated PowerPoint reports using VBA macros and Teradata macros.
  • Performance tuning and optimization of database configuration and application SQL.
  • Tested the stored procedures and custom tools for the Teradata version upgrade from 14 to 14.10.
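
A small sketch of the statistics-driven tuning workflow implied above, using only standard Teradata statements (edw_db.sales and its columns are hypothetical):

      /* Ask the optimizer which statistics it thinks the query could use. */
      DIAGNOSTIC HELPSTATS ON FOR SESSION;

      EXPLAIN
      SELECT store_id, SUM(sale_amt)
      FROM edw_db.sales
      WHERE sale_dt BETWEEN DATE '2013-01-01' AND DATE '2013-01-31'
      GROUP BY store_id;

      /* Collect the recommended statistics, then re-run the EXPLAIN and
         compare estimated row counts and join/aggregation steps. */
      COLLECT STATISTICS ON edw_db.sales COLUMN (store_id);
      COLLECT STATISTICS ON edw_db.sales COLUMN (sale_dt);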

Environment: Teradata 14, NCR UNIX servers, Oracle, MVS, SAS 9.1, DB2, VBA macros, Ab Initio, BTEQ, MLOAD, FLOAD, Affinium, Erwin.

Confidential

Teradata/ETL Developer

Responsibilities:

  • Understood the business needs and implemented the same into a functional database design.
  • Worked closely with Project Managers, Business Analysts, BI Architect, source system owners, Data Management/Data Quality team to ensure timely and accurate delivery of business requirements.
  • Coded using Teradata BTEQ SQL; wrote UNIX scripts to validate, format and execute the SQLs in the UNIX environment.
  • Working with vendors to provide required data.
  • Create BTEQ scripts to extract data from warehouse for downstream applications.
  • Participated in meetings on designing and preparing the source matrix for the system for both dealer vendors, and on subsequent strategy changes in those systems.
  • Export data using BTEQ and FastExport (a BTEQ report-export sketch follows this list).
  • Used Teradata SQL Assistant extensively to work with Teradata Database.
  • Developed MLOAD scripts to load data from load-ready files into the Teradata warehouse.
  • Developed processes on Teradata using RDBMS utilities such as FastLoad, FastExport and BTEQ.
  • Versed in proactively addressing DBMS performance issues such as partition optimization, indexing and statistics collection.
  • Troubleshoot the failure Jobs.
  • Developed shell scripts and Perl programs to get data from all systems and load it into the data warehousing system. The data was standardized to store various business units in tables.
  • Performance tuning of sources, targets and SQL queries in transformations.
  • Used Visual Explain for query optimization.
  • Developed unit test plans and involved in system testing.
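
For the BTEQ/FastExport export bullets above, a minimal BTEQ report-export sketch (the dealer table, columns and output path are assumptions):

      .RUN FILE = logon.btq;                   -- holds the .LOGON line
      .SET WIDTH 254;                          -- widen the report line

      .EXPORT REPORT FILE = /data/extracts/dealer_summary.txt;

      SELECT dealer_id      (TITLE 'Dealer')
           , SUM(sale_amt)  (TITLE 'Sales', FORMAT 'ZZZ,ZZZ,ZZ9.99')
      FROM edw_db.dealer_sales
      GROUP BY 1
      ORDER BY 1;

      .EXPORT RESET;                           -- stop writing to the file
      .LOGOFF;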

Environment: Teradata 14, Teradata Queryman, Teradata load utilities (BTEQ, MultiLoad, FastLoad and FastExport), UNIX.

Confidential, Richmond VA

Teradata/Informatica Developer

Responsibilities:

  • Communication with business users and analysts on business requirements.
  • Creating scope documents and coming up with the technical design.
  • Working on SQLs to retrieve required data.
  • Using Informatica to load data from source to staging.
  • Helping the Reporting team by providing the Teradata SQL queries.
  • Working on the Claims and related subject areas.
  • Involvement in implementation of BTEQ and Bulk load jobs.
  • Created BTEQ scripts to extract data from warehouse for downstream applications.
  • Exporting Data using BTEQ and Fast Export.
  • Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy and Sequence Generator.
  • Execute the SQL’s on UNIX environment.
  • Responsible for data cleansing of source data using IDQ transformations such as Labeler and Standardizer.
  • Expertise in working with various operational sources like Oracle and Mainframe.
  • Knowledgeable in Teradata PMON.
  • Performance tuning of long-running queries.
  • Developed processes on Teradata using RDBMS utilities such as FastLoad, FastExport and BTEQ.
  • Involved in understanding requirements and in modeling activities for the attributes identified from different source systems in Oracle, Teradata, CSV files and Mainframe.
  • Troubleshooting the failure Jobs.
  • Checking table skew (see the skew-check query after this list).
  • Developed MLOAD scripts to load data from load-ready files into the Teradata warehouse.
  • Tuning and optimization of Teradata queries.
  • Code reviews for production issues.
  • Solved various defects in the set of wrapper scripts that executed the Teradata BTEQ, MLOAD and FLOAD utilities.
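
The table-skew check mentioned above can be done against the DBC dictionary; a sketch (the database name is a placeholder, and on older releases the view is DBC.TableSize rather than DBC.TableSizeV):

      -- CurrentPerm is reported per AMP, so comparing the average to the
      -- maximum per-AMP size gives a simple skew factor per table.
      SELECT TableName,
             SUM(CurrentPerm) AS total_perm,
             MAX(CurrentPerm) AS max_amp_perm,
             AVG(CurrentPerm) AS avg_amp_perm,
             100 - (AVG(CurrentPerm) / NULLIFZERO(MAX(CurrentPerm)) * 100) AS skew_pct
      FROM DBC.TableSizeV
      WHERE DatabaseName = 'EDW_DB'            -- placeholder database
      GROUP BY TableName
      ORDER BY skew_pct DESC;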

Environment: Teradata V2R13, Teradata Queryman, Oracle 8i, Teradata load utilities (BTEQ, MultiLoad, FastLoad, FastExport), PowerCenter 9.

Confidential, FL

Teradata Developer.

Responsibilities:

  • Communication with business users and analysts on business requirements.
  • Fulfilled ad-hoc requests from superiors.
  • Working on SQLs to retrieve required data.
  • Helping the Reporting team by providing Teradata SQL queries.
  • Working on individual work requests.
  • Involvement in implementation of BTEQ and Bulk load jobs.
  • Worked on Teradata PMON and Teradata Manager.
  • Created BTEQ scripts to extract data from warehouse for downstream applications.
  • Exporting data using BTEQ and FastExport.
  • Executed SQLs in the UNIX environment; performance-tuned long-running queries.
  • Developed processes on Teradata using RDBMS utilities such as FastLoad, FastExport and BTEQ.
  • Experience creating SSIS packages; experience in DTS migration and metadata management: migrating DTS packages to SSIS, the Package Migration Wizard and storage management.
  • Created SSIS packages to load data from flat file (2 GB to 4 GB) to flat file and from flat file to SQL Server 2008.
  • Developed MLOAD scripts to load data from load-ready files into the Teradata warehouse.
  • Tuning and optimization of Teradata Queries.
  • Worked on UDFs for masking data.
  • Created Aggregate Join Indexes (AJIs) for OLAP and ROLAP development (see the AJI sketch after this list).
  • Code reviews for production issues.
  • Monitoring load jobs and issues.
  • Interaction with users to work on ASTs.
  • Changing SQLs as required and creating reports for users.
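
A hedged sketch of an Aggregate Join Index like those mentioned above, pre-aggregating a daily fact to month level so the optimizer can rewrite qualifying OLAP queries against it (table and column names are illustrative):

      CREATE JOIN INDEX edw_db.sales_mth_aji AS
      SELECT store_id,
             EXTRACT(YEAR  FROM sale_dt) AS sale_yr,
             EXTRACT(MONTH FROM sale_dt) AS sale_mth,
             SUM(sale_amt)               AS tot_sales
      FROM edw_db.daily_sales
      GROUP BY store_id,
               EXTRACT(YEAR  FROM sale_dt),
               EXTRACT(MONTH FROM sale_dt)
      PRIMARY INDEX (store_id);

Queries that aggregate sale_amt by store and month can then be satisfied from the AJI without touching the detail rows, provided statistics are collected on the join index as well.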

Environment: Teradata V2R12, Teradata Queryman, Oracle 8i, Erwin 7.2, Teradata load utilities (BTEQ, MultiLoad, FastLoad, FastExport), SQL Server 2008.

Confidential

Teradata/Informatica Developer.

Responsibilities:

  • Understanding the BRD, data requirement documents and mapping documents.
  • Experienced in creating databases and users.
  • Space estimation and user maintenance on the dev machine.
  • Designing the ETLs and conducting review meetings.
  • Coded test SQLs and analyzed results.
  • Created ad-hoc reports using SQL Server 2005 Reporting Services (SSRS).
  • Involvement in implementation of BTEQ and Bulk load jobs.
  • Working on different transformations (Router, Aggregator, Expression).
  • Coding using Teradata BTEQ SQL.
  • Solved various defects in the set of wrapper scripts that executed the Teradata BTEQ, MLOAD and FLOAD utilities.
  • Executed SQLs in the UNIX environment; performance-tuned long-running queries.
  • Worked with the Teradata EXPLAIN facility, which describes to end users how the database system will perform any request.
  • Migrated all DTS packages to SQL Server Integration Services (SSIS) and modified the packages accordingly using the advanced features of SQL Server Integration Services.
  • Populated and refreshed Teradata tables using the FastLoad, MultiLoad and FastExport utilities for user acceptance testing and for loading history data into Teradata.
  • Reduced Teradata space usage by optimizing tables: adding compression where appropriate and ensuring optimum column definitions (see the compression sketch after this list).
  • Monitoring ETL jobs until production jobs are stabilized.
  • Tuning and optimization of Teradata Queries.
  • Using UNIX wrappers to build and run the workflows.
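
As a sketch of the multi-value compression work described above (the table, columns and value lists are assumptions; real candidates come from data demographics):

      CREATE TABLE edw_db.cust_addr
      ( cust_id    INTEGER NOT NULL
      , state_cd   CHAR(2)  COMPRESS ('FL','TX','CA')  -- most frequent values
      , status_cd  CHAR(1)  COMPRESS ('A')             -- NULLs also compress once COMPRESS is specified
      , addr_line  VARCHAR(120)
      )
      PRIMARY INDEX (cust_id);

Each compressed value is stored once in the table header, so frequent values cost only a presence bit per row, which is where the space savings mentioned above come from.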

Environment: Teradata V2R6, Teradata Queryman, Informatica 7.1.2/8, SQL Server 2005, Oracle 8i, Erwin 7.2, Teradata load utilities (BTEQ, MultiLoad, FastLoad, FastExport, TPump).
