
Senior Teradata Developer/Technical Analyst Resume


Atlanta, GA

SUMMARY

  • 7+ years of experience in information technology, with expertise in the Teradata database in a data warehousing environment, Oracle PL/SQL, and Informatica.
  • More than 6 years of experience as a Teradata developer and 2 years of experience as an ETL developer in a data warehousing environment.
  • Good knowledge of Oracle and Teradata RDBMS architecture; involved in loading data into Oracle and Teradata data warehouses from disparate data sources such as IBM DB2, Oracle, and SQL Server.
  • Extensive use of data loading utilities such as FastLoad, MultiLoad, FastExport, and TPump.
  • Expertise in report formatting, batch processing, data loading, and export using BTEQ.
  • Hands-on experience in query performance tuning using EXPLAIN plans, COLLECT STATISTICS, compression, NUSIs, and join indexes, including sparse join indexes; well versed in reading EXPLAIN plans and optimizer confidence levels.
  • Profound knowledge of and experience in dimensional modeling and 3NF using the modeling tool Erwin.
  • Extracted source data from mainframe z/OS using JCL scripts and SQL into a UNIX environment, and created formatted reports for business users using BTEQ scripts.
  • Automated BTEQ report generation on a weekly and monthly basis using UNIX scheduling tools.
  • Developed UNIX shell scripts and used the BTEQ, FastLoad, MultiLoad, and FastExport utilities extensively to load data to the target database.
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables to extract data for several business users on a scheduled basis.
  • Collected statistics on important tables to obtain better plans from the Teradata optimizer.
  • Designed and built stored procedures and macros for the module.
  • Skilled in data warehouse data modeling using star schema and snowflake schema.
  • Developed UNIX shell scripts for batch processing.
  • Created UNIX shell scripts for the Informatica ETL tool to automate sessions, and Autosys scheduling in Teradata.
  • Good data warehousing ETL experience with Informatica PowerCenter client tools such as Mapping Designer, Repository Manager, and Workflow Manager.
  • Strong experience in extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data marts.
  • Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Applied relational and dimensional data modeling techniques to design Erwin data models.
  • Created and scheduled sessions and batches through the Informatica Server Manager.
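As an illustration of the BTEQ report formatting and export work described above, a minimal export script might look like the following sketch; the logon string, database, table, and file names are placeholders, not taken from any actual project:

```sql
.LOGON tdprod/etl_user,password;               -- placeholder credentials
.EXPORT REPORT FILE = /data/reports/weekly_sales.txt;

SELECT  region_name
     ,  SUM(sales_amt) (FORMAT 'ZZZ,ZZZ,ZZ9.99')  -- formatted report column
FROM    sales_dw.weekly_sales
GROUP BY region_name
ORDER BY region_name;

.EXPORT RESET;
.LOGOFF;
.QUIT;
```

Scripts like this are typically driven by a UNIX scheduler (cron or a vendor tool) for the weekly and monthly report runs mentioned above.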

TECHNICAL SKILLS

Databases: Teradata V2R4 / V2R5 / V2R6 / 12, DB2, Oracle 7.x / 8.x / 9.x, MS Access 2000.

ETL Tool: Informatica

Reporting Tools: Business Objects 6.x / 5.x, Seagate Crystal Reports 6.x / 5.x / 4.x / 3.x, SQR 6.x, Forms 6i, Reports 6i, and MS Access Reports.

Programming: PL/SQL, VBA, UNIX Shell Scripting, Teradata SQL, ANSI SQL, Transact-SQL, C, C++, Java, HTML, XML, JavaScript, COBOL, JCL

O/S: NCR UNIX 00 / 5500, Sun Solaris 2.6 / 2.7, HP - UX, IBM AIX 4.2 / 4.3, MS-DOS 6.22, Novell NetWare 4.11 / 3.61

PROFESSIONAL EXPERIENCE

Confidential - Atlanta, GA

Senior Teradata Developer/Technical Analyst

Responsibilities:

  • Worked on loading data from several flat file sources using Teradata FastLoad and MultiLoad.
  • Wrote Teradata SQL queries to join tables and make modifications to table data.
  • Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
  • Fine-tuned the existing mappings, achieving increased performance and reduced load times for faster user query performance.
  • Used Trillium for data quality checking, data cleansing, data standardization, and matching (Converter, Parser, Geocoder, Matcher).
  • Performed name parsing and standardization through Trillium-supplied tables built at Harte-Hanks' data processing center.
  • Performed address parsing and correction through Trillium postal tables and geocoding modules.
  • Files were sent through the Trillium Converter, Parser, Geocoder, and Winkey processes to cleanse, parse, and validate address information.
  • Developed graphs for the ETL processes, extensively using Join, Rollup, Scan, Normalize, Denormalize, and Reformat transform components, as well as Partition and Departition components.
  • Quickly adapted to Capital One Agile methodology (3 week sprints) and actively participated in Sprint Planning sessions.
  • Created customized MultiLoad scripts on the UNIX platform for Teradata loads.
  • Sorted data files using UNIX shell scripting.
  • Fine-tuned MultiLoad scripts considering the number of loads scheduled and the volumes of load data.
  • Used a data profiler in ETL processes and data integrators to ensure client requirements were met, including checks on column properties, column values, and referential integrity.
  • Wrote scripts to extract data from Oracle and load it into Teradata.
  • Worked on datasets for analysis.
  • Worked with SAS on exporting data using Teradata FastExport.
  • Wrote Teradata BTEQ scripts to implement the business logic.
  • Hands-on with Teradata Queryman to interface with the Teradata database.
  • Used SQL Profiler for troubleshooting, monitoring, and optimizing SQL Server queries from developers and testers.
  • Worked on Cognos 8 Suite (Event Studio, Query Studio, Analysis Studio, and Report Studio).
  • Experience in designing, developing, and maintaining Cognos Impromptu catalogs, PowerPlay cubes, and reports.
  • Used UNIX scripts to access Teradata and Oracle data.
  • Developed UNIX shell scripts for data manipulation.
  • Wrote proactive data audit scripts.
  • Wrote data quality scripts for new market integration.
  • Developed complex transformation code for derived duration fields.
  • Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
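The customized MultiLoad scripts mentioned above typically follow a structure like this sketch; all object names, field definitions, and file paths here are hypothetical:

```sql
.LOGTABLE sales_dw.ml_sales_log;               -- restart log table (placeholder)
.LOGON tdprod/etl_user,password;               -- placeholder credentials

.BEGIN IMPORT MLOAD TABLES sales_dw.daily_sales;

.LAYOUT sales_layout;
.FIELD store_id   * VARCHAR(10);
.FIELD sale_date  * VARCHAR(10);
.FIELD sale_amt   * VARCHAR(18);

.DML LABEL ins_sales;
INSERT INTO sales_dw.daily_sales (store_id, sale_date, sale_amt)
VALUES (:store_id, :sale_date (DATE, FORMAT 'YYYY-MM-DD'), :sale_amt);

.IMPORT INFILE /data/in/daily_sales.dat        -- pipe-delimited input file
        FORMAT VARTEXT '|'
        LAYOUT sales_layout
        APPLY ins_sales;

.END MLOAD;
.LOGOFF;
```

Tuning such scripts against load volumes usually comes down to session counts, checkpoint intervals, and how many MultiLoad jobs are scheduled concurrently.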

Environment: NCR 4800/5100, Teradata V12 (BTEQ, FastLoad, MultiLoad, Teradata SQL, FastExport), mainframe MVS/OS, JCL, TSO/ISPF, COBOL, DB2, SAS, Oracle, PL/SQL, Trillium, shell scripting

Confidential - San Jose, CA

Teradata Developer

Responsibilities:

  • Involved in database design and preparing SQL scripts to support the larger databases involving terabytes of data.
  • Coordinated with the database team to create the necessary data sources for PSG (Premier Services) and FA (Financial Accounts) using the Metadata Utility.
  • Involved in the design of complex campaigns for the business users to accomplish different marketing strategies.
  • Coordinated with the test team in the design of test cases and preparation of test data to work with different channels and set up recency and timeout for the campaigns.
  • Involved in running the batch process for Teradata CRM.
  • Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts.
  • Worked on complex queries to map the data as per the requirements.
  • Extracted data from various production databases SAS, SYBASE, and Teradata to meet business report needs.
  • Designed and implemented stored procedures and triggers for automating tasks in SQL.
  • Worked on critical problems such as booked metrics and solved them successfully using SQL.
  • Interacted with technical analysts, business analysts, and operations analysts to resolve data issues.
  • Analyzed the current data movement (Informatica ETL) processes and procedures.
  • Used a data profiler to allow analysis of data directly in the database, which improves performance while eliminating the time and costs of moving data among databases.
  • Identified and assessed external data sources as well as internal and external data interfaces.
  • Created and updated MS Excel mapping documents by doing field-level analysis and field mapping.
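A stored procedure of the kind used to automate tasks above can be sketched in Teradata SQL as follows; the procedure and table names are illustrative only, not from any actual system:

```sql
REPLACE PROCEDURE sales_dw.refresh_booked_metrics ()
BEGIN
  -- Rebuild the booked-metrics summary from the detail table
  DELETE FROM sales_dw.booked_metrics_sum;

  INSERT INTO sales_dw.booked_metrics_sum (acct_id, booked_amt)
  SELECT acct_id, SUM(booked_amt)
  FROM   sales_dw.booked_metrics_dtl
  GROUP  BY acct_id;
END;
```

Once compiled, a procedure like this can be invoked from a scheduled batch job with `CALL sales_dw.refresh_booked_metrics();`.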

Environment: Teradata V12, BTEQ, Teradata Manager, Teradata SQL Assistant, FastLoad, MultiLoad, FastExport, Rational ClearQuest, Control-M, UNIX, MQ, NDM, FTP, SAS, PL/SQL

Confidential, New Berlin, WI

Teradata developer/Informatica

Responsibilities:

  • Migrated tables from Oracle to Teradata.
  • Wrote BTEQ and MultiLoad scripts to load data from Oracle to Teradata.
  • Helped V-MIS in preparing a usage inventory document.
  • Analyzed the dependencies of existing jobs on the Oracle data mart (BCMS).
  • Used UNIX/Perl scripts to access Teradata and Oracle data.
  • Sourced data from Teradata to Oracle using FastExport and Oracle SQL*Loader.
  • Worked with Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, and Informatica Server Manager.
  • Created the Informatica metadata repository using the Repository Manager as a hub for interaction between the various tools; security and user management and repository backup were also done with the same tool.
  • Used the Informatica Designer tools to design source definitions, target definitions, and transformations to build mappings.
  • Created the mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.
  • Used Server Manager for creating and maintaining sessions, and to monitor, edit, schedule, copy, abort, and delete sessions.
  • Applied efficient mapping and transformation techniques in the ETL processes.

Environment: Red Hat Enterprise Linux 64-bit, Windows XP, Informatica Power Center 7.1, NCR TeradataV2R6, Business Objects 11, Crystal Reports 10, VB.

Confidential, Denver, CO

Teradata developer

Responsibilities:

  • Analyzed issues and proposed solutions to the client.
  • Analyzed test results and documented them.
  • Prepared detailed documents to be shared across the organization.
  • Created export scripts using the Teradata FastExport utility.
  • Prepared test cases and test data.
  • Created data manipulation and definition scripts using the Teradata BTEQ utility; involved in the analysis and design of the system.
  • Involved in testing of the prototype.
  • Created load scripts using the Teradata FastLoad and MultiLoad utilities; performed code reviews and code walkthroughs of the prototype.
  • Created procedures in Teradata SQL.
  • Attended corporate meetings for the kickoff of major enhancements at the corporate level.
  • Organized meetings with the SMEs of the dependent systems when changes were made to the existing system.
  • Created UNIX scripts for file manipulation.
  • Involved in preparation of architecture and infrastructure documentation.
  • Involved in designing the DR process for the ESB application.
  • Involved in the weekly issue meeting with the customer.
  • Organized meetings with the department heads when changes were recommended to the existing system for performance improvement.
  • Organized meetings with the team on a daily basis.
  • Reverse engineered and documented existing code.
  • Involved in decision making for changing the existing programs for special processing.
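The FastLoad load scripts referenced above generally follow this pattern; logon, database, table, and file names are placeholders only:

```sql
LOGON tdprod/etl_user,password;                -- placeholder credentials

DROP TABLE stage_db.cust_err1;                 -- clear prior error tables
DROP TABLE stage_db.cust_err2;

BEGIN LOADING stage_db.cust_stg
      ERRORFILES stage_db.cust_err1, stage_db.cust_err2;

SET RECORD VARTEXT '|';                        -- pipe-delimited input

DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(50))
FILE = /data/in/cust.dat;

INSERT INTO stage_db.cust_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);

END LOADING;
LOGOFF;
```

FastLoad targets an empty table, so scripts like this usually load a staging table first, with BTEQ or MultiLoad handling the subsequent merge into the target.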

Environment: UNIX, Teradata RDBMS, BTEQ, FastLoad, Teradata Administrator, MultiLoad, TPump, Power, FTP, SFTP

Confidential, Minneapolis, MN

Teradata developer/Informatica

Responsibilities:

  • Involved in database design and preparing SQL scripts to support the larger databases involving terabytes of data.
  • Created export scripts using the Teradata FastExport utility.
  • Created data manipulation and definition scripts using the Teradata BTEQ utility.
  • Involved in the design and implementation of the module.
  • Created load scripts using the Teradata FastLoad and MultiLoad utilities.
  • Involved in system testing, UAT, and regression testing.
  • Carried out enhancement work as per the MCR given by the client.
  • Created UNIX scripts to manipulate and load the data.
  • Performed code reviews.
  • Attended corporate meetings for the kickoff of major enhancements at the corporate level.
  • Organized meetings with the SMEs of the dependent systems when changes were made to the existing system.
  • Involved in documentation for the customer deliverables and preparing user documentation.
  • Extensively used various data cleansing and data conversion functions in various transformations.
  • Developed various mappings, mapplets, and transformations for migration of data from one system to a new system using Informatica PowerCenter Designer.
  • Extracted and loaded data from flat files and Oracle sources to an Oracle database using transformations.
  • Wrote shell scripts and control files to load data into staging tables using SQL*Loader.
  • Performed tuning of sessions in the target, source, mapping, and session areas.
  • Involved in debugging Informatica mappings and testing of stored procedures and functions.
  • Worked on different OLTP data sources such as Oracle, SQL Server, and flat files for data extraction.

Environment: Teradata RDBMS, BTEQ, FastLoad, MultiLoad, FastExport, Teradata Manager, Teradata SQL Assistant, Rational ClearQuest, UNIX, MQ, NDM, FTP

Confidential

SQL Developer

Responsibilities:

  • Managed databases, tables, indexes, views, and stored procedures.
  • Enforced business rules with triggers and user-defined functions; handled troubleshooting and replication.
  • Wrote stored procedures and checked the code for efficiency.
  • Maintained and corrected Transact-SQL (T-SQL) statements.
  • Performed daily monitoring of database performance and network issues.
  • Administered the MS SQL Server by creating user logins with appropriate roles, dropping and locking logins, monitoring user accounts, creating groups, granting privileges to users and groups, and configuring SQL authentication.
  • Rebuilt indexes on various tables.
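On SQL Server 7/2000 of the era described, index rebuilds were commonly done with DBCC commands; a brief T-SQL sketch, with database, table, and fill-factor values hypothetical:

```sql
-- Check fragmentation before deciding whether to rebuild
DBCC SHOWCONTIG ('Orders');

-- Rebuild all indexes on the table with a 90% fill factor
-- (empty string for the index name means "all indexes")
DBCC DBREINDEX ('SalesDB.dbo.Orders', '', 90);
```

Runs like this were typically scheduled as maintenance jobs during off-peak windows, guided by the SHOWCONTIG scan density output.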

Environment: MS SQL Server 6.5, SQL Server 7, MS SQL Server 2000
