Senior Teradata Developer/Technical Analyst Resume
Atlanta, GA
SUMMARY
- 7+ years of experience in information technology, with expertise in the Teradata database in data warehousing environments, Oracle PL/SQL, and Informatica.
- More than 6 years of experience as a Teradata Developer and 2 years as an ETL Developer in data warehousing environments.
- Good knowledge of Oracle and Teradata RDBMS architecture; loaded data into Oracle and Teradata data warehouses from disparate sources such as IBM DB2, Oracle, and SQL Server.
- Extensive use of Teradata load and export utilities such as FastLoad, MultiLoad, FastExport, and TPump.
- Expertise in report formatting, batch processing, data loading, and export using BTEQ.
- Hands-on experience in query performance tuning using EXPLAIN plans, COLLECT STATISTICS, compression, NUSI, and join indexes (including sparse join indexes). Well versed in interpreting EXPLAIN plans and optimizer confidence levels.
- Profound knowledge of and experience in dimensional modeling and 3NF using the Erwin modeling tool.
- Extracted source data from mainframe z/OS into a UNIX environment using JCL scripts and SQL, and created formatted reports for business users using BTEQ scripts.
- Automated BTEQ report generation on a weekly and monthly basis using UNIX scheduling tools.
- Developed UNIX shell scripts and used the BTEQ, FastLoad, MultiLoad, and FastExport utilities extensively to load data into the target database.
- Created several BTEQ scripts involving derived tables and volatile/global temporary tables to extract data for business users on a scheduled basis.
- Collected statistics on important tables so the Teradata optimizer could produce better query plans.
- Involved in designing and building stored procedures and macros for the module.
- Skilled in data warehouse data modeling using star and snowflake schemas.
- Developed UNIX Shell scripts for Batch processing.
- Created UNIX shell scripts to automate Informatica ETL sessions and Autosys scheduling for Teradata.
- Good data warehousing ETL experience using Informatica PowerCenter client tools such as Mapping Designer, Repository Manager, and Workflow Manager.
- Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts.
- Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
- Applied relational and dimensional data modeling techniques to design Erwin data models.
- Created and scheduled sessions and batches through the Informatica Server Manager.
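The statistics-collection and EXPLAIN-plan work summarized above can be sketched in Teradata SQL as follows; the database, table, and column names are hypothetical:

```sql
-- Illustrative only: all object names are hypothetical.
-- Collect statistics so the optimizer has current demographics.
COLLECT STATISTICS ON sales_dw.order_fact COLUMN (order_date);
COLLECT STATISTICS ON sales_dw.order_fact INDEX (customer_id);

-- Review the optimizer's plan and confidence levels before running.
EXPLAIN
SELECT c.region, SUM(o.order_amt)
FROM   sales_dw.order_fact o
JOIN   sales_dw.customer_dim c
       ON o.customer_id = c.customer_id
WHERE  o.order_date >= DATE '2012-01-01'
GROUP  BY c.region;
```

With fresh statistics in place, the EXPLAIN output typically reports retrieval and join steps with "high confidence" rather than "no confidence."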
TECHNICAL SKILLS
Databases: Teradata V2R4 / V2R5 / V2R6 / 12, DB2, Oracle 7.x / 8.x / 9.x, MS Access 2000.
ETL Tool: Informatica
Reporting Tools: Business Objects 6.x / 5.x, Seagate Crystal Reports 6.x / 5.x / 4.x / 3.x, SQR 6.x, Forms 6i, Reports 6i, and MS Access Reports.
Programming: PL/SQL, VBA, UNIX Shell Scripting, Teradata SQL, ANSI SQL, Transact-SQL, SQL, C, C++, Java, HTML, XML, JavaScript, COBOL, JCL
O/S: NCR UNIX 00 / 5500, Sun Solaris 2.6 / 2.7, HP - UX, IBM AIX 4.2 / 4.3, MS-DOS 6.22, Novell NetWare 4.11 / 3.61
PROFESSIONAL EXPERIENCE
Confidential - Atlanta, GA
Senior Teradata Developer/Technical Analyst
Responsibilities:
- Worked on loading of data from several flat files sources using Teradata FastLoad and MultiLoad.
- Wrote Teradata SQL queries to join tables and make modifications to table data.
- Transfer of large volumes of data using Teradata FastLoad, MultiLoad and T-Pump.
- Fine-tuned the existing mappings, achieving increased performance and reduced load times for faster user query performance.
- Used Trillium for data quality checking, data cleansing, data standardization, and matching (Converter, Parser, Geocoder, Matcher).
- Performed name parsing and standardization through Trillium-supplied tables built at Harte-Hanks' data processing center.
- Performed address parsing and correction through Trillium postal tables and geocoding modules.
- Sent files through the Trillium Converter, Parser, Geocoder, and Winkey processes to cleanse, parse, and validate address information.
- Developed graphs for the ETL processes using Join, Rollup, Scan, Normalize, Denormalize, and Reformat transform components, with extensive use of Partition and Departition components.
- Quickly adapted to Capital One Agile methodology (3 week sprints) and actively participated in Sprint Planning sessions.
- Creation of customized Mload scripts on UNIX platform for Teradata loads.
- Sorted data files using UNIX Shell scripting.
- Fine tuning of Mload scripts considering the number of loads scheduled and volumes of load data.
- Used a data profiler and data integrators in ETL processes to verify client requirements, including checks on column properties, column values, and referential integrity.
- Wrote scripts to extract data from Oracle and load it into Teradata.
- Worked on datasets for analysis.
- Worked with SAS on exporting data using Teradata FastExport.
- Wrote Teradata BTEQ scripts to implement the business logic.
- Hands-on use of Teradata Queryman to interface with Teradata.
- Used SQL Profiler for troubleshooting, monitoring, and optimizing SQL Server queries from developers and testers.
- Worked on Cognos 8 Suite (Event Studio, Query Studio, Analysis Studio, and Report Studio).
- Experience in designing, developing and maintaining Cognos Impromptu Catalogs, Power Play Cubes and reports
- Used UNIX scripts to access Teradata & Oracle Data
- Developed UNIX shell scripts for data manipulation
- Involved in writing proactive data audit scripts.
- Involved in writing data quality scripts for new market integration
- Developed complex transformation code for derived duration fields.
- Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
Environment: NCR 4800/5100, Teradata V12 (BTEQ, FastLoad, MultiLoad, Teradata SQL, FastExport), Mainframes MVS/OS, JCL, TSO/ISPF, COBOL, DB2, SAS, Oracle, PL/SQL, Trillium, shell scripting
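The BTEQ reporting and extract scripts described in this role can be sketched as follows; the logon string, table names, and formats are placeholder examples only:

```sql
.LOGON tdpid/etl_user,password;

/* Stage this week's detail rows in a volatile table (illustrative names). */
CREATE VOLATILE TABLE wk_detail AS (
    SELECT acct_id, txn_date, txn_amt
    FROM   dw.txn_detail
    WHERE  txn_date >= CURRENT_DATE - 7
) WITH DATA ON COMMIT PRESERVE ROWS;

/* Export a formatted report file for business users. */
.EXPORT REPORT FILE = weekly_txn_report.txt
SELECT acct_id, txn_date, txn_amt (FORMAT '$$,$$$,$$9.99')
FROM   wk_detail
ORDER  BY txn_date;
.EXPORT RESET
.LOGOFF
```

A wrapper shell script would typically invoke this file with `bteq < weekly_report.btq` under a cron or Autosys schedule.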
Confidential - San Jose, CA
Teradata Developer
Responsibilities:
- Involved in database design/preparing SQL scripts to support the larger databases that involves terabytes of data.
- Coordinated with the database team to create the necessary data sources for PSG (Premier Services) and FA (Financial Accounts) using Metadata Utility.
- Involved in the design of complex campaigns for the Business users to accomplish different marketing strategies.
- Coordinated with the test team on the design of test cases and preparation of test data to work with different channels, and to set up recency and timeout for the campaigns.
- Involved in running the batch process for Teradata CRM.
- Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts.
- Worked on complex queries to map the data as per the requirements.
- Extracted data from various production databases SAS, SYBASE, and Teradata to meet business report needs.
- Designed and implemented stored procedures and triggers for automating tasks in SQL.
- Worked on critical problems, such as booked metrics, and solved them successfully using SQL.
- Interacted with technical analysts, business analysts, and operations analysts to resolve data issues.
- Analyzed the current data movement (Informatica ETL) process and procedures.
- Used Data profiler to allow the analysis of data directly in the database, which improves performance, while eliminating the time and costs of moving data among databases.
- Identify and assess external data sources as well as internal and external data interfaces
- Created and updated MS Excel mapping document by doing field level analysis and field mapping.
Environment: Teradata V12, BTEQ, Teradata Manager, Teradata SQL Assistant, FastLoad, MultiLoad, FastExport, Rational ClearQuest, Control-M, UNIX, MQ, NDM, FTP, SAS, PL/SQL
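A FastExport script of the kind listed in this role might look like the following minimal sketch; all identifiers (log table, logon, source table, columns) are hypothetical:

```sql
.LOGTABLE utillog.fexp_accounts_log;
.LOGON tdpid/etl_user,password;

.BEGIN EXPORT SESSIONS 4;

.EXPORT OUTFILE accounts_extract.dat MODE RECORD FORMAT TEXT;

/* Pipe-delimit the columns so downstream loaders can parse the records. */
SELECT acct_id || '|' || acct_status || '|' || TRIM(CAST(balance AS CHAR(18)))
FROM   dw.account
WHERE  acct_status = 'OPEN';

.END EXPORT;
.LOGOFF;
```

FastExport is suited to high-volume extracts like these because it pulls data in parallel sessions rather than row by row.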
Confidential, New Berlin, WI
Teradata Developer/Informatica Developer
Responsibilities:
- Migrated tables from Oracle to Teradata.
- Wrote BTEQ and MLoad scripts to load data from Oracle to Teradata.
- Helped V-MIS in preparing usage inventory document.
- Analyzed the dependencies of existing job on Oracle data mart (BCMS).
- Used UNIX/Perl scripts to access Teradata & Oracle Data.
- Sourced data from Teradata to Oracle using Fast Export and Oracle SQL Loader.
- Worked with Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, Transformations, Repository Manager, and Server Manager.
- Created the Informatica metadata repository using the Repository Manager as a hub for interaction between the various tools; security, user management, and repository backups were also handled with the same tool.
- Used the Informatica Designer tools to design source definitions, target definitions, and transformations to build mappings.
- Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.
- Used Server Manager to create and maintain sessions, and to monitor, edit, schedule, copy, abort, and delete them.
- Applied efficient mapping and transformation techniques throughout the ETL process.
Environment: Red Hat Enterprise Linux 64-bit, Windows XP, Informatica PowerCenter 7.1, NCR Teradata V2R6, Business Objects 11, Crystal Reports 10, VB
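An MLoad script of the kind used above to load Oracle extracts into Teradata can be sketched as follows; the log table, staging table, layout, and file names are hypothetical:

```sql
.LOGTABLE utillog.ml_customer_log;
.LOGON tdpid/etl_user,password;

.BEGIN MLOAD TABLES stg.customer;

/* Describe the incoming pipe-delimited record (illustrative fields). */
.LAYOUT cust_layout;
.FIELD cust_id    * VARCHAR(10);
.FIELD cust_name  * VARCHAR(60);

.DML LABEL ins_cust;
INSERT INTO stg.customer (cust_id, cust_name)
VALUES (:cust_id, :cust_name);

.IMPORT INFILE customer_extract.dat
        FORMAT VARTEXT '|'
        LAYOUT cust_layout
        APPLY ins_cust;

.END MLOAD;
.LOGOFF;
```

MultiLoad's restartable log table makes it a common choice over FastLoad when the target table already contains data.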
Confidential, Denver, CO
Teradata Developer
Responsibilities:
- Involved in analyzing issues and proposing solutions to the client.
- Involved in analyzing and documenting test results.
- Prepared detailed documents to be shared across the organization.
- Created export scripts using the Teradata FastExport utility.
- Prepared test cases and test data.
- Created data manipulation and definition scripts using the Teradata BTEQ utility.
- Involved in the analysis and design of the system.
- Involved in testing of the prototype.
- Created load scripts using the Teradata FastLoad and MLoad utilities.
- Performed code reviews and code walkthroughs of the prototype.
- Created procedures in Teradata SQL.
- Attended corporate meetings for the kickoff of major enhancements at the corporate level.
- Organized meetings with the SMEs of dependent systems when changes were made to the existing system.
- Created UNIX scripts for file manipulation.
- Involved in preparation of architecture and infrastructure documentation.
- Involved in Designing DR process for the ESB application
- Involved in the weekly issue meeting with the customer.
- Organized meetings with department heads when changes were recommended to the existing system for performance improvement.
- Organized daily meetings with the team.
- Reverse engineered and documented existing code.
- Involved in decision making on changing existing programs for special processing.
Environment: UNIX, Teradata RDBMS, BTEQ, FastLoad, Teradata Administrator, MultiLoad, TPump, Power, FTP, SFTP
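The Teradata SQL procedures mentioned in this role can be sketched as below; the procedure, tables, and columns are hypothetical examples of an archival routine:

```sql
-- Illustrative only: object names and business rule are hypothetical.
REPLACE PROCEDURE dw.archive_closed_accounts
    (IN cutoff_dt DATE, OUT rows_moved INTEGER)
BEGIN
    -- Copy closed accounts older than the cutoff into the history table.
    INSERT INTO dw.account_hist
    SELECT * FROM dw.account
    WHERE  acct_status = 'CLOSED' AND close_date < :cutoff_dt;

    -- Capture how many rows the INSERT affected.
    SET rows_moved = ACTIVITY_COUNT;

    -- Remove the archived rows from the active table.
    DELETE FROM dw.account
    WHERE  acct_status = 'CLOSED' AND close_date < :cutoff_dt;
END;
```

It would be invoked from BTEQ or SQL Assistant with something like `CALL dw.archive_closed_accounts(DATE '2011-12-31', cnt);`.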
Confidential, Minneapolis, MN
Teradata Developer/Informatica Developer
Responsibilities:
- Involved in database design/preparing SQL scripts to support the larger databases that involves terabytes of data.
- Created export scripts using the Teradata FastExport utility.
- Created data manipulation and definition scripts using the Teradata BTEQ utility.
- Involved in the design and implementation of the module.
- Created load scripts using the Teradata FastLoad and MLoad utilities.
- Involved in system testing, UAT, and regression testing.
- Carried out enhancement work as per the MCR given by the client.
- Created UNIX scripts to manipulate and load the data.
- Performed code reviews.
- Attended corporate meetings for the kickoff of major enhancements at the corporate level.
- Organized meetings with the SMEs of dependent systems when changes were made to the existing system.
- Involved in documentation for the customer deliverables and preparing user documentation.
- Extensively used various data cleansing and data conversion functions in various transformations.
- Developed various mappings, mapplets, and transformations to migrate data from the old system to the new system using Informatica PowerCenter Designer.
- Extracted and loaded data from flat files and Oracle sources into an Oracle database using transformations.
- Wrote shell scripts and control files to load data into staging tables using SQL*Loader.
- Performed tuning of sessions in the target, source, mapping, and session areas.
- Involved in debugging Informatica mappings and testing stored procedures and functions.
- Worked with different OLTP data sources such as Oracle, SQL Server, and flat files for data extraction.
Environment: Teradata RDBMS, BTEQ, FastLoad, MultiLoad, FastExport, Teradata Manager, Teradata SQL Assistant, Rational ClearQuest, UNIX, MQ, NDM, FTP
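The UNIX file-manipulation scripts mentioned above can be sketched as follows: a minimal example that sorts a pipe-delimited extract on its key column and drops exact duplicate records before a load. The file name, delimiter, and sample data are hypothetical.

```shell
#!/bin/sh
# Illustrative pre-load cleanup of a pipe-delimited extract.
# Create a small sample extract (hypothetical customer records).
printf 'C003|Smith\nC001|Jones\nC003|Smith\nC002|Brown\n' > customer_extract.dat

# Sort on the first (key) field, then remove exact duplicate records.
sort -t'|' -k1,1 customer_extract.dat | uniq > customer_sorted.dat

# Report how many distinct records remain for the load.
wc -l < customer_sorted.dat
```

In practice a script like this would run just before the FastLoad/MLoad step so the utility never sees duplicate rows.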
Confidential
SQL Developer
Responsibilities:
- Managing databases, tables, indexes, views, stored procedures.
- Enforcing business rules with triggers and user defined functions, troubleshooting, and replication.
- Writing the Stored Procedures, checking the code for efficiency.
- Maintained and corrected Transact-SQL (T-SQL) statements.
- Performed daily monitoring of database performance and network issues.
- Administered MS SQL Server: created user logins with appropriate roles, dropped and locked logins, monitored user accounts, created groups, granted privileges to users and groups, and configured SQL authentication.
- Rebuilt indexes on various tables.
Environment: MS SQL Server 6.5, SQL Server 7, MS SQL Server 2000
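The business-rule triggers described in this role can be sketched in T-SQL as follows; the table, column, and rule are hypothetical, written in the SQL Server 7/2000-era style used here:

```sql
-- Illustrative only: enforce a hypothetical rule that order amounts
-- must be positive, rejecting any insert or update that violates it.
CREATE TRIGGER trg_order_min_amount
ON dbo.Orders
FOR INSERT, UPDATE
AS
IF EXISTS (SELECT 1 FROM inserted WHERE OrderAmount <= 0)
BEGIN
    RAISERROR ('Order amount must be positive.', 16, 1)
    ROLLBACK TRANSACTION
END
```

Rolling back inside the trigger undoes the offending statement, so invalid rows never persist in the table.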