
Teradata Consultant Resume

Bellevue, WA

SUMMARY:

  • More than 12 years of experience in Information Technology with extensive involvement in Business Intelligence, Data Warehousing, ETL, Teradata Architecture, Data Analysis, Data Modeling and Development assignments.
  • Worked as Teradata Consultant, Developer, Technical Architect, Technical Lead.
  • Diverse industry experience, including Financial & Banking, Retail, Healthcare, Telecom and Insurance.
  • Proficient in Project Management methodologies: Waterfall, Critical Path Method, and Agile Environments.
  • Thoroughly proficient in the complete SDLC of Teradata Enterprise Data Warehouses, helping solve client business challenges and increase profits.
  • Experienced in Data Architecture Design and Review processes, Scope Analysis, Resource Planning, and Project Budgeting.
  • Experienced in Performance tuning, Application Development, and Application Support on UNIX, MVS and WINDOWS NT Environments.
  • Expertise in Performance Monitoring and SQL Query Tuning using Teradata Utilities: BTEQ, FastExport, FastLoad, MultiLoad, TPump and SQL Assistant.
  • Built Data Warehouses and marts using the Ralph Kimball and W. H. Inmon design approaches.
  • Expert in data warehousing techniques for ETL/ELT, Data cleansing, Surrogate key assignment, Slowly Changing Dimension (SCD).
  • Worked on Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
  • Experience in deployment of Business Objects & Web Intelligence against various sources: Oracle, Teradata, MS SQL Server, DB2, Sybase and Access.
  • Worked on Analytical and Operational Data Marts Model.
  • Proficient in automating Autosys batch jobs using UNIX Shell scripts, and in creating webdocs and Visio diagrams to document the automated job flows.
  • Highly adaptive, with a proven ability to work in a fast-paced team environment. Excellent track record of driving end-to-end implementation and integration projects to successful completion under tight deadlines.
  • Strong communication skills, a systematic approach, and quick adaptability to new technologies and new working environments.
  • Good knowledge of Hadoop.

TECHNICAL SKILLS:

Operating Systems: Unix/Linux, Windows, MVS (OS/390), IBM z/OS.

Databases: Teradata 15.10/14.10/13.10/12.0, V2R6/V2R5.0, Oracle 8i/9i/10g/11g, SQL Server 2000/2005, DB2, Sybase, MS Access 2010.

Teradata Tools & Utilities: Query facilities: SQL Assistant, BTEQ, Teradata Administrator, Teradata Studio Express and Teradata Viewpoint. Load & export: TPT, FastLoad, MultiLoad, FastExport, BTEQ, OLE Load, Oracle SQL*Loader.

ETL: Informatica 7.1, 8.0, 8.6, 9.1, Datastage 7.5.

Languages: C, C++, COBOL, Java, Visual Basic, ASP.Net, JCL, SQL

Scripting Languages: UNIX Shell, JCL, Java Script, VB Script, Perl, Python

Other: NCR 4800/5500/5600/6700/6750, Teradata Administrator, Teradata Viewpoint, IBM TM1, Cognos, Business Objects, Erwin, AutoSys, GitHub, MS Visio 2013, HP Quality Center, Tableau, Hadoop.

PROFESSIONAL EXPERIENCE:

Confidential, Bellevue, WA

Teradata Consultant

Responsibilities:

  • Working on the Engineering team as a Teradata Consultant to support Financial Analytics.
  • Involved in complete Project Life Cycle from Analysis, Design & Development to go-live.
  • Responsible for end-to-end Teradata design and development to support the Financial Analytics Tableau Dashboard.
  • Work with Project Management, Teradata Architects, SQL Server Architects, Business Solution Analysts and Business Users to understand and design the enterprise data warehouse solutions.
  • Responsible for post-deployment user training and for creating complete documentation to hand off the solution to Production Support teams.
  • Created Unix Shell scripts using TDCH Connector to load data from HDFS Hive to Teradata.
  • Used OLE DB Access Module and Teradata Parallel Transporter to load data from SQL Server to Teradata.
  • Created various Views, Lookup tables, Aggregated Tables, Macros and developed BTEQ scripts to extract data from the detail tables to aggregated tables to support analytics dashboard.
  • Involved in Designing and implementing join indexes, aggregate join indexes, secondary indexes, compression, statistics, and SQL re-writes to improve the Performance.
  • Hands on with MS Visio for Batch Flow diagrams and created XML documents to design Autosys Batch Jobs.
  • Used Teradata Viewpoint for Query Performance monitoring.
  • Hands on with Teradata Queryman & Administrator to interface with Teradata.
  • Involved in writing proactive data audit scripts, multiple test cases.
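
The TDCH-based Hive-to-Teradata load mentioned above can be sketched as a UNIX shell wrapper. This is a minimal sketch only: the host, database, table and credential names are hypothetical, and the TDCH jar path and options vary by installation, so the hadoop invocation runs only where a client is actually present.

```shell
#!/bin/sh
# Sketch of a TDCH (Teradata Connector for Hadoop) export wrapper.
# All names below (host, databases, tables, jar path) are hypothetical.
TDCH_JAR=/usr/lib/tdch/teradata-connector.jar   # assumed install location
TD_URL="jdbc:teradata://tdprod.example.com/database=fin_stg"
TD_USER=etl_user
HIVE_DB=finance
HIVE_TABLE=daily_revenue
TD_TABLE=stg_daily_revenue

# Build the TDCH export command (Hive -> Teradata).
TDCH_CMD="hadoop jar $TDCH_JAR com.teradata.connector.common.tool.ConnectorExportTool \
  -url $TD_URL -username $TD_USER -password \$TD_PASS \
  -jobtype hive -sourcedatabase $HIVE_DB -sourcetable $HIVE_TABLE \
  -targettable $TD_TABLE -method batch.insert"

# Record the generated command, then run it only if a Hadoop client exists.
echo "$TDCH_CMD" > tdch_cmd.txt
if command -v hadoop >/dev/null 2>&1; then
    eval "$TDCH_CMD"
else
    echo "hadoop client not found; generated tdch_cmd.txt only"
fi
```

In practice such a wrapper would be scheduled from Autosys and pick up the password from a protected environment file rather than the command line.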

Environment: Teradata 15.10 (Queryman, Administrator, Teradata Viewpoint, Teradata Studio Express, BTEQ, FastLoad, MultiLoad, FastExport), FTP, Oracle, SQL Server, TDCH, TPT, UNIX Shell scripting, Hadoop, Hive SQL.

Confidential, BEAVERTON, OR

Sr Teradata Developer/ Technical Lead/ Data Architect

Responsibilities:

  • Working for the BIE team, developed and deployed various solutions in Major & Minor iterative release paths under Waterfall, CPM and Agile to support the Enterprise Information System.
  • Involved in complete Project Life Cycle from Analysis, Design & Development to go-live.
  • Responsible for converting BRD to Technical Design, Project Planning, Estimating the development efforts and System Capacity.
  • Participated in daily standup scrum meetings and Sprint Planning sessions.
  • Developed and deployed multiple projects for various Capability Groups like Sales, Order Fulfillment Services, Enterprise Financial Services, & Global Supply Chain Performance Analytics to support the front-end Reporting & Planning Applications like COGNOS, TM1 & Tableau Dashboards.
  • Work with Project Management, Release Management, Scrum Masters, Architects, Data Modeling Team, Cognos Framework Modelers, TM1 Developers, Tableau consultants, Business Solution Analysts & with Business Users to understand & Design the enterprise data warehouse solutions.
  • Responsible for the complete change control process, from request intake and change-request estimation through Development, Code Reviews, Testing and migrating the work to the QA & Production Environments.
  • Responsible for post-deployment user training and for creating complete documentation to hand off the solution to Production Support teams.
  • Created STMs (source-to-target mappings) to extract data from the legacy system into the Staging layer, and from the Staging layer to the Provisioning layer, as per business requirements.
  • Created various Views, Lookup tables, Aggregated Tables, Macros, Stored Procedures to support different Business Capability Groups for their Cognos Framework Reporting Applications & TM1 Planning Applications.
  • Worked on User Maintained Data Base Framework to support Slowly Changing Dimensions, Relationship tables, Facts, Dimensions.
  • Developed BTEQ scripts to extract data from the detail tables for the downstream IBM TM1 Application, a simplified reporting application for Financial Planning.
  • Created customized MLoad scripts on the UNIX platform for SAP BW Staging to Teradata Staging Layer loads using Informatica 9.1.
  • Involved in Designing and implementing join indexes, aggregate join indexes, secondary indexes, compression, statistics, and SQL re-writes to improve the Performance.
  • Created complex SQLs and Lookup tables to provide the same fact data solutions with different logical dimensions for various regions as the business and marketing requirements vary across the regions for Nike’s Business Planning.
  • Fine-tuned the existing mappings, achieving increased performance and reduced load times for faster user query performance.
  • Hands on with MS Visio for Batch Flow diagrams and created XML documents to design Autosys Batch Jobs.
  • Used Teradata Viewpoint for Query Performance monitoring.
  • Hands on with Teradata Queryman & Administrator to interface with Teradata.
  • Involved in writing proactive data audit scripts, multiple test cases.
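
The detail-to-aggregate BTEQ loads described in the bullets above can be sketched as a shell script that generates and runs a BTEQ job. Database, table and column names here are hypothetical placeholders, and the bteq invocation is guarded so the sketch is harmless where no Teradata client is installed.

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that rolls detail rows up into an
# aggregate table feeding a reporting/TM1 dashboard. All object names
# are hypothetical; the logon line is a placeholder.
cat > load_agg.bteq <<'EOF'
.LOGON tdprod/etl_user,<password>
.SET ERROROUT STDOUT

/* Rebuild the weekly aggregate from the detail table. */
DELETE FROM rpt_db.sales_wk_agg;

INSERT INTO rpt_db.sales_wk_agg (wk_end_dt, region_cd, sales_amt)
SELECT wk_end_dt, region_cd, SUM(sales_amt)
FROM   dtl_db.sales_dtl
GROUP BY wk_end_dt, region_cd;

/* Abort the batch with a nonzero code if the load failed. */
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF
.QUIT 0
EOF

# Run only where the Teradata client is installed on this box.
if command -v bteq >/dev/null 2>&1; then
    bteq < load_agg.bteq
else
    echo "bteq not found; generated load_agg.bteq only"
fi
```

The `.IF ERRORCODE` check is what lets an Autosys batch flow detect and halt on a failed aggregate load.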

Environment: Teradata 15.10/14.10/13.10/12.0 (Queryman, Administrator, Teradata Viewpoint, Teradata Studio Express, BTEQ, FastLoad, MultiLoad, FastExport), FTP, Oracle, Informatica PowerCenter 9.1/8.6, Linux, Sun Solaris, UNIX Shell scripting, MS Visio 2003/2007/2013, IBM Cognos TM1, IBM Cognos 10, Tableau 8.1.

Confidential, Columbus, OH

Teradata Developer

Responsibilities:

  • Worked on the Balancing Process of the Enterprise Data Warehouse. Thoroughly studied the HYBRID model of the Enterprise Data Warehouse. Worked closely with the ETL & Data Modeling teams to identify and fix balancing defects.
  • Wrote BTEQ scripts to identify Balancing defects between Fact & Summary tables from various source systems, including Exclusive Channels and Independent Channels.
  • Wrote COBOL programs to extract the data from DB2. Created UNIX Shell scripts to load the data into the Data Warehouse, using FTP to transfer the source data and BTEQ & MLoad to load it.
  • Created customized MLoad scripts on the UNIX platform for Teradata loads using Informatica. Created JCL and Control Cards to run jobs in batch mode. Hands on with Teradata Queryman to interface with Teradata.
  • Created Balancing Check scripts in BTEQ to identify defects while loading data from various sources, created MLoad scripts to reload the defective data, validated the Fact & Summarized tables, and used FastExport to send the validated data to the Business.
  • Used XMIT and NDM to send files and PDS between the production and development boxes.
  • Created Join indexes and Temporary tables to improve the query processing.
  • Fine-tuned the existing mappings and achieved increased performance and reduced load times for faster user query performance.
  • Played an important role in upgrading Informatica from 8.0 to 8.6 to take better advantage of the Teradata load utilities.
  • Fine-tuned MLoad scripts considering the number of loads scheduled and the volumes of load data.
  • Acted as a single resource with sole responsibility for Informatica-to-Teradata conversions.
  • Wrote scripts to extract the data from Oracle and load it into Teradata.
  • Worked on exporting data using Teradata FastExport.
  • Wrote Teradata BTEQ scripts to implement the business logic.
  • Hands on with Teradata Queryman to interface with Teradata.
  • Involved in writing data quality scripts for new acquisitions integration.
  • Developed complex transformation code for derived duration fields.
  • Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
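
The customized MLoad work described above follows a standard control-script shape, sketched below. Every table, layout, field and file name is a hypothetical placeholder, and the mload invocation only runs where the Teradata utilities are installed.

```shell
#!/bin/sh
# Sketch of a MultiLoad (MLoad) control script of the kind described
# above. Object and file names are hypothetical placeholders.
cat > claims.mload <<'EOF'
.LOGTABLE wrk_db.claims_log;
.LOGON tdprod/etl_user,<password>;

.BEGIN IMPORT MLOAD TABLES stg_db.claims_stg
    WORKTABLES wrk_db.claims_wt
    ERRORTABLES wrk_db.claims_et wrk_db.claims_uv;

.LAYOUT claim_layout;
    .FIELD claim_id   * VARCHAR(18);
    .FIELD claim_dt   * VARCHAR(10);
    .FIELD claim_amt  * VARCHAR(15);

.DML LABEL ins_claim;
    INSERT INTO stg_db.claims_stg (claim_id, claim_dt, claim_amt)
    VALUES (:claim_id, :claim_dt, :claim_amt);

.IMPORT INFILE /data/in/claims.dat
    FORMAT VARTEXT '|'
    LAYOUT claim_layout
    APPLY ins_claim;

.END MLOAD;
.LOGOFF;
EOF

# Invoke only where the Teradata load utilities are installed.
if command -v mload >/dev/null 2>&1; then
    mload < claims.mload
else
    echo "mload not found; generated claims.mload only"
fi
```

The error and work tables (`_et`, `_uv`, `_wt`) are what the fine-tuning bullets refer to when balancing the number of concurrent loads against data volumes.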

Environment: NCR 4800/5500, Teradata 12.0 (BTEQ, FastLoad, MultiLoad, T-SQL, FastExport), IBM z/OS, FTP, RACF, JCL, TSO/ISPF, COBOL, DB2, Oracle, Netezza, Informatica PowerCenter 8.6/8.0, Sun Solaris, UNIX Shell scripting.

Confidential, San Bruno, CA

Teradata Developer

Responsibilities:

  • Worked on loading data from several flat-file sources using Teradata FastLoad and MultiLoad.
  • Transferred large volumes of data using Teradata FastLoad, MultiLoad and TPump.
  • Performed database-to-database transfers of data (minimal transformations) using OleLoad.
  • Wrote COBOL programs to extract the data from DB2.
  • Fine-tuned the existing mappings and achieved increased performance and reduced load times for faster user query performance.
  • Developed ETL jobs using DataStage v7.5 (parallel) for system testing, and executed system test scenarios.
  • Involved in low-level design and developed various jobs and performed data loads and transformations using different stages of DataStage and pre-built routines, functions and macros.
  • Created Join indexes and Temporary tables to improve the query processing.
  • Used stages to perform inner, left outer and right outer joins between tables, leveraging data from flat files and SQL Server 2000 using Ascential DataStage 7.5.
  • Performed Unix Shell Scripting to format data between Oracle Tables and Teradata RDBMS.
  • Fine-tuned MLoad scripts considering the number of loads scheduled and the volumes of load data.
  • Wrote scripts to extract the data from Oracle and load it into Teradata.
  • Worked on exporting data using Teradata FastExport.
  • Wrote Teradata BTEQ scripts to implement the business logic.
  • Hands on with Teradata Queryman to interface with Teradata.
  • Involved in writing proactive data audit scripts.
  • Involved in writing data quality scripts for new market integration.
  • Developed COBOL modules to apply transformation to the source data at load time.
  • Developed complex transformation code for derived duration fields.
  • Extensive use of T-SQL case statements to flag the detail records on the warehouse tables.
  • Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
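
The FastExport work above can be sketched the same way: a shell script generating a FastExport job that pulls validated detail rows to a flat file. Names and paths are hypothetical, and the fexp invocation is guarded for boxes without the Teradata utilities.

```shell
#!/bin/sh
# Sketch of a FastExport script extracting detail rows for downstream
# reporting. Object, column and file names are hypothetical.
cat > export_detail.fexp <<'EOF'
.LOGTABLE wrk_db.fexp_log;
.LOGON tdprod/etl_user,<password>;

.BEGIN EXPORT SESSIONS 8;

.EXPORT OUTFILE /data/out/call_detail.dat
    MODE RECORD FORMAT TEXT;

SELECT  call_id (CHAR(18)),
        call_dt (CHAR(10)),
        duration_sec (CHAR(10))
FROM    dtl_db.call_detail
WHERE   call_dt = CURRENT_DATE - 1;

.END EXPORT;
.LOGOFF;
EOF

if command -v fexp >/dev/null 2>&1; then
    fexp < export_detail.fexp
else
    echo "fexp not found; generated export_detail.fexp only"
fi
```

FastExport is the right utility here because it streams large result sets over multiple sessions, unlike a single-session BTEQ export.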

Environment: NCR 4800/5100, Teradata V2R5 (BTEQ, FastLoad, MultiLoad, T-SQL, FastExport), Unix, Windows, OS/390, MVS, JCL, TSO/ISPF, COBOL, DB2, Oracle, SQL Server 2005, Ascential DataStage, Unix Shell scripting.

Confidential, Iselin, NJ

Developer - Teradata / Informatica

Responsibilities:

  • Interacted with business community and gathered requirements based on changing needs.
  • Designed the logical Data Model using Erwin and transformed Logical model to Physical database using Power Designer.
  • Performed Data Mart and Dimensional Modeling, and Star and Snowflake Schema modeling, for the Data Warehouse.
  • Developed mappings/scripts to extract data from Oracle, Flat files, SQL Server, DB2 and load into data warehouse using the Mapping Designer, BTEQ, FastLoad and MultiLoad.
  • Exported data from Teradata database using FastExport and BTEQ.
  • Wrote SQL Queries, Triggers, Procedures, Macros, Packages and Shell Scripts to apply and maintain the Business Rules.
  • Extensively used OLAP functions such as RANK, CSUM, MSUM and GROUP BY GROUPING SETS.
  • Coded and implemented PL/SQL packages to perform batch job scheduling.
  • Performed Teradata and Informatica tuning to improve the performance of the Load.
  • Performed error handling using error tables and log files.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.
  • Performed DML and DDL operation with the help of SQL transformation in Informatica.
  • Created simple reports using Business Objects.
  • Version Control, Deployment Groups, Labels, Queries were used to manage Metadata.
  • Collaborated with the Informatica Admin on the Informatica upgrade from PowerCenter 7.1 to PowerCenter 8.1.
  • Used the SQL Transformation for sequential loads in Informatica PowerCenter ETL processes.
  • Worked closely with the business analysts' team to resolve Problem Tickets and Service Requests. Helped the 24/7 Production Support team.
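
The OLAP functions named in the bullets above (RANK, CSUM, MSUM, GROUP BY GROUPING SETS) can be illustrated with a short generated SQL file. This is a sketch of legacy Teradata OLAP syntax against hypothetical tables, not production code.

```shell
#!/bin/sh
# Sketch of the kind of Teradata OLAP SQL referenced above. Table and
# column names are hypothetical.
cat > olap_examples.sql <<'EOF'
/* Rank products by sales within each region, with a running (CSUM)
   and a 3-row moving (MSUM) total. Legacy Teradata OLAP syntax;
   GROUP BY restarts the computation per region. */
SELECT region_cd,
       product_id,
       sales_amt,
       RANK(sales_amt DESC)               AS sales_rank,
       CSUM(sales_amt, sales_amt DESC)    AS running_total,
       MSUM(sales_amt, 3, sales_amt DESC) AS moving_total_3
FROM   rpt_db.product_sales
GROUP BY region_cd;

/* Subtotals by region, by product, and a grand total in one pass. */
SELECT region_cd, product_id, SUM(sales_amt)
FROM   rpt_db.product_sales
GROUP BY GROUPING SETS ((region_cd), (product_id), ());
EOF
echo "wrote olap_examples.sql"
```

Newer Teradata releases express the first query with ANSI window functions (`SUM(...) OVER (PARTITION BY ... ORDER BY ...)`), but the legacy forms above match the toolset of this engagement.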

Environment: Teradata V2R5, Informatica PowerCenter 8.1/7.1, SQL Transformation, Oracle 9i, Business Objects, PL/SQL, SQL Server, Teradata SQL Assistant, SQL Developer, Windows XP Professional, UNIX, Business Objects 6.0, Erwin, Control M.

Confidential

Informatica Consultant

Responsibilities:

  • Designed source-to-target mappings, primarily from Flat files to Oracle, using Informatica PowerCenter.
  • Involved in the Dimensional Data Modeling and populating the business rules using mappings into the Repository for Data management.
  • Involved in developing source-to-target mappings and scheduling Informatica sessions.
  • Used various kinds of transformations to implement simple and complex business logic, including connected & unconnected Lookups, Router, Expression, Source Qualifier, Aggregator, Filter and Sequence Generator.
  • Worked extensively in Oracle SQL and PL/SQL, including query performance tuning; created DDL scripts and database objects such as Tables, Indexes, Synonyms and Sequences.
  • Tuned Informatica Mappings and Sessions for optimum performance.
  • Worked in writing UNIX scripts, SQL statements and interacted with development and production teams to expedite the process.

Environment: Informatica Power Center 6.2, Business Objects 5.1, DB2, Sybase, Windows Server 2003, SQL, PL/SQL.
