Teradata Consultant / Vendor Team Captain Resume
Atlanta, GA
PROFESSIONAL SUMMARY:
- Over 10 years of IT experience in the design and development of ETL methodology supporting data transformation and processing in corporate-wide ETL solutions using Informatica, Teradata, and Oracle
- Dedicated and focused team player with exceptional data warehouse experience
- Extensive experience implementing data warehouses and data marts with Star and Snowflake schemas.
- Proven track record in planning, building, and managing successful large-scale data warehouse and decision support systems. Comfortable with both technical and functional applications of RDBMS, data mapping, data management, data transportation, and data staging
- Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star schema and Snowflake schema (Fact Tables, Dimension Tables) used in relational, dimensional and multidimensional modeling
- Expert in the Teradata RDBMS, including initial Teradata DBMS environment setup, development and production DBA support, and use of the FastLoad, MultiLoad, TPump, BTEQ, and Teradata SQL utilities.
- Extensive experience in the design and development of data warehouses and data marts using Informatica PowerCenter/PowerMart 9.5/8.5.x/7.1.x/6.1.x and associated tools to map, transform, and extract data feeds from various source platforms and databases
- Expertise in query analysis, performance tuning, and testing.
- Experience in writing database scripts such as SQL queries, PL/SQL Stored Procedures, Indexes, Functions, Views, Materialized views and Triggers
- Experience in writing UNIX shell scripts and Mainframe JCL to support and automate the ETL process
- Worked on Ataccama for ABC checks
- Built Control M jobs for scheduling.
- Interacted periodically with clients to resolve mapping issues
- Created test data to ensure successful data loading.
- Strong Retail domain experience and communication background
- Excellent communication and interpersonal skills; proactive, dedicated, and eager to learn new technologies and tools.
- Strong commitment to quality; experienced in ensuring compliance with coding standards and the review process.
TECHNICAL SKILLS:
ETL Tools: Teradata Utilities (FastLoad, MultiLoad, Fast Export, BTEQ, TPump and SQL assistant), Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x
Business Intelligence Tools: MicroStrategy 9, OBIEE 11g/12c, SQL Server Data Tools 2015, Business Objects 6.5.1/6.5, Crystal Reports, Ataccama
Languages: SQL, PL/SQL, MySQL, SQL*Plus, C, C++, UNIX Shell scripting, Mainframe JCL
Databases: Oracle 8i/9i/10g, SQL Server, Teradata 15/14/13/12/V2R6/V2R5, DB2, MS Access 2000
Oracle Utilities/Tools: Toad, SQL*Loader, PL/SQL, Erwin
Operating Systems: Windows 98/2000/NT/XP, UNIX, MS-DOS, Mainframes
PROJECT EXPERIENCE:
Confidential, Atlanta, GA
Teradata Consultant / Vendor Team Captain
Responsibilities:
- Spearheaded a team in a fast-paced Agile environment with two-week sprints
- Actively participated in understanding business requirements, analysis and designing ETL process.
- Applied the business requirements effectively and transformed the business rules into mappings.
- Translated customer requirements into formal requirements and design documents.
- Analyzed the source-to-target mappings and worked with Data Architects to clarify the requirements
- Developed Mappings between source systems and Warehouse components along with Data Architects.
- Analyzed the source data loaded into staging tables through Spark Streaming.
- Created BTEQ scripts for loading the staging data into image tables
- Created job definition files that help the master controller set up the load sequence, from date definition through the image-table and core-table loads, based on the type of transformation.
- Once the data is loaded into core tables, data quality is validated through Audit, Balance and Control (ABC) checks and a Data Validation Tool (DVT).
- Data validation checks include PK checks, row counts, duplicate record checks and others
- Worked on using Ataccama as the Data validation tool to complement the existing DVT
- Created Control M jobs using Control M workload Automation. Created quantitative resources and set the dependencies between the jobs.
- Developed strategies for incremental data extractions as well as data migration to load into Teradata.
- Developed UNIX shell scripts to run batch jobs in production.
- Worked on fine tuning SQL queries and created indexes to help the queries run faster.
- Involved in the performance tuning for fast retrieval of data.
- Effectively involved in user acceptance testing and preparing test cases.
- Involved in code reviews.
- Worked on GitHub, Jenkins for code migration from Dev to QA environment
- Stepped into the Team Captain role and led the team
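A staging-to-image BTEQ load of the kind described above can be sketched roughly as follows; all object names, credentials, and columns are illustrative placeholders, not artifacts from the engagement:

```sql
.LOGON tdprod/etl_user,etl_password;

/* Hypothetical staging-to-image load; table and column names are illustrative */
BT;

DELETE FROM edw_img.customer_img WHERE load_dt = CURRENT_DATE;

INSERT INTO edw_img.customer_img (cust_id, cust_name, load_dt)
SELECT cust_id, cust_name, CURRENT_DATE
FROM edw_stg.customer_stg;

ET;

/* Exit with a nonzero return code so the scheduler can flag the failure */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

In practice an error check typically follows each SQL statement so the master controller can stop the sequence at the failing step.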
Environment: Teradata 15.10 (Fast Export, BTEQ and SQL Assistant), UNIX, Control-M workload Automation 9.0.00, Spark Streaming, GitHub, Jenkins for CI/CD, Rally.
Confidential, Richardson, TX
Teradata Lead
Responsibilities:
- Worked closely with the business on the SDM and SPEDM Gap Analysis project
- Presented the current data-level architecture to the client, covering the flow of data and how reports are sent from different systems to the various pharma partners
- Identified issues with Reconciliation of data and presented it to the executive team
- Identified the pain points within each system pertaining to the reports
- Spearheaded meetings with other source systems and Intermediate systems and worked on the process flows and exception analysis
- Performed open-issue analysis
- Performed as-is analysis between the Oracle and Teradata systems for the data flowing into the system
- Converted Oracle queries to Teradata queries and compared the outputs of both the queries for gap analysis
- Performed a technical comparison of the systems covering size, system availability, number of tables, ETL jobs, CDC, error-handling capability, reporting layer, report count, user count, and report SLAs, among others.
- Compared the subject areas and attributes across different systems
- Provided Short Term and Long Term Recommendations.
- Began learning Hadoop as part of a knowledge-gaining program to work with new technologies.
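Typical Oracle-to-Teradata rewrites in such a gap analysis look like the following; the query itself is a made-up example, not one from the project:

```sql
/* Oracle original (hypothetical) */
SELECT claim_id,
       NVL(paid_amt, 0) AS paid_amt,
       DECODE(status, 'A', 'Active', 'Closed') AS status_desc
FROM claims
WHERE ROWNUM <= 100;

/* Teradata equivalent used for side-by-side output comparison */
SELECT TOP 100
       claim_id,
       COALESCE(paid_amt, 0) AS paid_amt,
       CASE status WHEN 'A' THEN 'Active' ELSE 'Closed' END AS status_desc
FROM claims;
```

COALESCE and CASE are ANSI standard, so the converted form also runs unchanged on Oracle, which simplifies comparing the outputs of both queries.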
Environment: Teradata 15 (MultiLoad, Fast Export, BTEQ and Teradata SQL Assistant), Informatica 9.5.1, Unix, MS Access, Oracle.
Confidential, Atlanta, GA
Teradata Lead
Responsibilities:
- Involved in Requirement gathering, business Analysis, Design and Development, testing and implementation of business rules.
- Development of an automated process for loading the data into the base tables in EDW (for ETL) using the FastLoad, MultiLoad, and BTEQ utilities of Teradata. This includes writing MultiLoad, FastLoad, and BTEQ scripts for loading the data from the extract files into the staging tables
- Developed scripts to load the data from source to staging and from staging to target tables using load utilities such as BTEQ, FastLoad, and MultiLoad.
- Writing scripts for data cleansing, data validation, data transformation for the data coming from different source systems.
- Developed Mappings between source systems and Warehouse components.
- Loading Data from Flat files to Teradata enterprise Data Warehouse using MLOAD Utility.
- Created and maintained database tables, views, and indexes, and collected statistics.
- Performed performance tuning, including query optimization and space management, using SQL Assistant.
- Analyzed EXPLAIN plan details for better understanding.
- Created new summary tables from Base tables
- Created views on those summary tables
- Created Parameterized Macros on those views as expected by POWERRASR tool
- Created multiple Global temporary tables and Volatile tables to be used in those Parameterized Macros
- Used DIAGNOSTIC HELPSTATS ON FOR SESSION extensively to identify all the columns on which statistics need to be collected
- Used ROLLUP and CUBE functions to aggregate data the way users want to see it on reports
- Created Transformations
- Preparing Test Cases and performing Unit Testing.
- Review of Unit and Integration test cases.
- Production Implementation and Post Production Support
- Generated weekly and monthly reports for internal and external database and system audits
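A parameterized macro of the kind described above, together with the DIAGNOSTIC HELPSTATS session setting, can be sketched as follows; all object names are hypothetical, and POWERRASR-specific conventions are omitted:

```sql
/* Parameterized macro over a summary view; all names are illustrative */
REPLACE MACRO edw.rpt_sales_by_year (p_year INTEGER) AS (
  SELECT region,
         product,
         SUM(sales_amt) AS total_sales
  FROM edw.sales_summary_v
  WHERE sales_year = :p_year
  GROUP BY ROLLUP (region, product);  -- per-region subtotals plus a grand total
);

/* Have the optimizer append recommended COLLECT STATISTICS
   statements to every EXPLAIN produced in this session */
DIAGNOSTIC HELPSTATS ON FOR SESSION;
EXPLAIN EXEC edw.rpt_sales_by_year (2016);
```

The HELPSTATS recommendations appear at the bottom of the EXPLAIN output and indicate which column statistics the optimizer would use if collected.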
Environment: Teradata 15 (MultiLoad, Fast Export, BTEQ and Teradata SQL Assistant), Unix, POWERRASR.
Confidential, Atlanta, GA
Teradata Lead and BI reporting Lead
Responsibilities:
- Designed and developed matrix and tabular reports with drill down, drill through and drop down menu option using SSRS.
- Created Ad-Hoc Reports, Summary Reports, Sub Reports, and Drill-down Reports using SSRS.
- Scheduled daily, weekly, and monthly reports for executives, business analysts, and customer representatives across various categories and regions, based on business needs, using SQL Server Reporting Services (SSRS).
- Created Cross-Tab and Sub-Report using RDL and promoted RDLs to Reporting Service Server.
- Performed problem identification, troubleshooting, resolution, and index tuning for databases.
- Queried data and created on-demand reports using Report Builder in SSRS and sent the reports via email.
- Developed complex queries and views to generate various Drill-through reports, parameterized reports and linked reports using SSRS.
- Created data driven subscriptions
- Implemented SSRS roles (Browser, Content Manager, System Administrator, My Reports, Publisher) for various business reports in a multi-user environment
- Developed the metadata repository (.rpd), consisting of the Physical, BMM, and Presentation layers and security; tested and validated the business model and repository-level calculations using the Oracle BI Administration Tool.
- Worked closely with OBIEE developers to create various reports. Created views on summary tables and helped in building the reports in OBIEE
- Used aliases to create self-joins
- Helped convert the snowflake model to a dimensional model, since OBIEE works better with dimensional modeling while the data warehouse used a snowflake model; MicroStrategy, however, worked well with the snowflake model, and SSRS requires no modeling.
- Built the business model and established physical and logical relationships and foreign keys between tables.
- Created business reports using BI Answers as per requirements.
- Generated various Analytics Reports using global and local filters.
- Identified the aggregate levels for each source and created dimension sources for each level of aggregates.
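The alias-based self-joins mentioned above follow the usual pattern of joining a table to itself under two aliases; the employee/manager example below is hypothetical:

```sql
/* Self-join via aliases: each employee row is matched to its manager's row */
SELECT e.emp_name,
       m.emp_name AS manager_name
FROM hr.employee e
LEFT JOIN hr.employee m
  ON e.manager_id = m.emp_id;
```

In the OBIEE physical layer, the same effect is achieved by importing the table twice under different alias objects and joining them there.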
Environment: Teradata 15 (MultiLoad, Fast Export, BTEQ and Teradata SQL Assistant), Unix, OBIEE 11g, SSRS 2008/2014.
Confidential, Atlanta, GA
Teradata and Microstrategy Lead
Responsibilities:
- Involved in Requirement gathering, business Analysis, Design and Development, testing and implementation of business rules.
- Involved in Data Modeling to identify the gaps with respect to business requirements and transforming the business rules into mappings.
- Translate customer requirements into formal requirements and design documents.
- Development of an automated process for loading the data into the base tables in EDW (for ETL) using the MultiLoad and BTEQ utilities of Teradata. This includes writing MultiLoad and BTEQ scripts for loading the data from the extract files into the temporary tables in the temporary database.
- Developed scripts to load the data from source to staging and from staging to target tables using load utilities such as BTEQ and MultiLoad.
- Writing scripts for data cleansing, data validation, data transformation for the data coming from different source systems.
- Developed Mappings between source systems and Warehouse components.
- Loading Data from Flat files to Teradata enterprise Data Warehouse using MLOAD Utility.
- Created and maintained database tables, views, and indexes, and collected statistics.
- Performed performance tuning, including query optimization and space management, using SQL Assistant.
- Analyzed EXPLAIN plan details for better understanding.
- Extensively worked with Metrics, Facts, Transformations, Custom Groups & Report Services.
- Responsible for creation of Canned/Ad-hoc Reports, Metrics with Public and Schema Objects for the requirements.
- Gathered requirements from management users for generation of reports in Web and Desktop.
- Once ETL completed and data was loaded into the data warehouse, defined and created MicroStrategy objects including Attributes, Filters, Cubes, Nested Filters, Hierarchies, Templates, Prompts, and Nested Prompts.
- Conditional formatting on metric thresholds, limiting report data access using security filters.
- Developed customized and ad-hoc reports in Grid and Graph mode using Templates, Metrics, Filters, Custom Groups, Consolidations and Transformations - end users were able to generate ad-hoc reports by drilling up, down, within, and across dimensions.
- Executed periodic (weekly and monthly) standard reports consisting of data analysis, combining a template with filters, using MicroStrategy Desktop and Web.
- Moved objects from one environment to another
- Created Ad-hoc reports with Dynamic prompt answers for the end users to generate.
- Improved report performance using VLDB properties, Caching.
- Implemented Hierarchies by defining Parent-child relationships.
- Created grid, graph and heat map Visualizations.
Environment: Teradata 14 (FastLoad, MultiLoad, Fast Export, BTEQ and Teradata SQL Assistant), Unix, MicroStrategy 9 (Architect, Developer).
Confidential, Atlanta, GA
Teradata and Microstrategy Developer
Responsibilities:
- Involved in designing the ETL process to extract, transform, and load data from OLAP systems into the Teradata data warehouse.
- Used BTEQ and SQL Assistant front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
- Involved in performance tuning and ETL code review; analyzed the target-based commit interval for optimum session performance.
- Responsible for tuning report queries and ad-hoc queries.
- Wrote transformations for data conversions into required form based on the client requirement using Teradata ETL processes.
- Experienced in Tuning SQL Statements and Procedures for enhancing the load performance in various schemas across databases. Tuning the queries to improve the performance of the report refresh time.
- Provide ongoing support by developing processes and executing object migrations, security and access privilege setup and active performance monitoring.
- Used Fast Export utility to extract large volumes of data at high speed from Teradata warehouse.
- Performed performance tuning of Teradata SQL statements handling large data volumes.
- Created UNIX scripts for various purposes like FTP, Archive files and creating parameter files.
- Scripts were run through UNIX shell programs in Batch scheduling.
- Designed Interactive Dashboards in Microstrategy Web to assess vendor's performance and monitored the overall goals of the business.
- Created periodic, quarterly, and yearly Key Performance Indicator (KPI) reports on key areas of the business.
- Developed new reports and converted existing reports for the business line to satisfy the specifications using Microstrategy Desktop.
- Designed Freeform SQL reports for meeting specific needs.
- Provided production support, troubleshooting and fixing report issues in record time.
- Created Objects like Object Prompts, Documents & Reports.
- Extensively worked in developing Metrics (Dimensional metrics, Compound metrics), Filters, Custom groups and Consolidations for complex reports.
- Extensively used Report Services to build dashboards and summary-level reports from multiple datasets.
- Involved in system testing and user acceptance testing
- Fixed defects during the SIT and UAT phases.
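A Fast Export extract of the kind used here can be sketched as follows; the log table, file path, credentials, and query are placeholders rather than project artifacts:

```sql
.LOGTABLE edw_wrk.orders_fexp_log;
.LOGON tdprod/etl_user,etl_password;

.BEGIN EXPORT SESSIONS 8;
.EXPORT OUTFILE /data/extracts/orders.dat
        MODE RECORD FORMAT TEXT;

/* Extract the last week of orders; names are illustrative */
SELECT order_id, order_dt, order_amt
FROM edw.orders
WHERE order_dt >= CURRENT_DATE - 7;

.END EXPORT;
.LOGOFF;
```

Fast Export is suited to exactly the high-volume pulls described above, since it spools the answer set across multiple sessions in parallel.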
Environment: Teradata 13 (FastLoad, MultiLoad, Fast Export, BTEQ and Teradata SQL Assistant), Unix, MicroStrategy 8.1
Confidential, Atlanta, GA
Teradata Developer
Responsibilities:
- Worked closely with clients for gathering the functional requirements and working with the project manager to give a high level design and the cost associated with the project.
- Wrote BTEQ scripts and JCL to load data from flat files on the mainframe to the Teradata warehouse using the BTEQ and MultiLoad utilities.
- Worked extensively with Flat Files and Fast Export utility.
- Created Parameterized Macros
- Worked with Visio to develop Flow Diagrams associated with the projects.
- Involved in Unit testing, smoke testing, system testing and integration testing and preparation of test cases.
- Worked in production teams to run jobs and troubleshoot production Issues/tickets.
- Designed and created various temporary and volatile tables as part of data transformation.
- Analyzed the sources system and documented the data quality issues.
- Performed performance tuning of Teradata SQL statements
- Involved in migrating jobs and associated scripts to the testing and production environments and setting up the respective environments
- Retrieved various Job outputs from $AVRS
- Involved in Code Reviews
Environment: Teradata 12 (FastLoad, MultiLoad, Fast Export, BTEQ and Queryman), Mainframes, DB2, Datacom, Erwin 4.0, VB and Windows 2000/NT.
Confidential, Durham, NC
Teradata Consultant
Responsibilities:
- Involved in the requirement analysis in support of Data Warehousing efforts.
- Extracted data from an Oracle database, transformed it, and loaded it into the Teradata target database according to the specifications
- Worked on PowerCenter Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
- Created tables, views, procedures for the database using SQL and PL/SQL.
- Involved in importing Source/Target Tables from the respective databases and created Reusable Transformations and Mappings using Designer Tool set of Informatica.
- Worked extensively on complex mappings using transformations such as Lookup, Expression, Filter, and Aggregator to populate the Data Warehouse.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Created sessions and workflows to run in parallel, sequentially and nested.
- Developed Teradata Fastload scripts for initial loads, Mload scripts for data loads and BTEQ scripts for developing post load processes.
- Developed Teradata SQL scripts to verify referential integrity and to provide custom SQL support to the end user
- Developed UNIX shell scripts to run batch jobs in production.
- Involved in Unit Testing and Preparing test cases.
- Involved in Peer Reviews.
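An initial-load FastLoad script of the shape described above might look like this; all object names, the delimiter, and the file path are assumptions for illustration:

```sql
/* FastLoad targets an empty table; error tables left by a prior run are dropped first */
SESSIONS 4;
ERRLIMIT 25;
LOGON tdprod/etl_user,etl_password;

DROP TABLE edw_stg.customer_err1;
DROP TABLE edw_stg.customer_err2;

BEGIN LOADING edw_stg.customer_stg
      ERRORFILES edw_stg.customer_err1, edw_stg.customer_err2;

SET RECORD VARTEXT '|';   -- pipe-delimited input file

DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(50))
FILE = /data/in/customer.dat;

INSERT INTO edw_stg.customer_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);

END LOADING;
LOGOFF;
```

FastLoad requires an empty target table, which is why it suits initial loads while MultiLoad handles the ongoing data loads mentioned above.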
Environment: Informatica 8.6 (Designer, Workflow Manager, Workflow Monitor and Repository Manager), Teradata 12 (FastLoad, MultiLoad, Fast Export, BTEQ and Queryman), UNIX, Oracle 10g, Erwin 4.0, Microstrategy and Windows 2000/NT.