
Sr Teradata Developer Resume


Temple Terrace, FL

SUMMARY:

  • 10+ years of experience in data warehouse projects involving all phases of the IT lifecycle, including planning and design, business requirements gathering, analysis, development, testing, data modeling, data governance, deployment and implementation, encompassing both Waterfall and Agile SDLC methodologies.
  • Analysis work frequently includes importing, cleaning, transforming, validating and modeling data in order to understand it and draw conclusions for decision-making purposes.
  • Experience in Data Mart, Data Warehousing, ETL, SQL, Teradata, Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport, Queryman) and UNIX Shell scripting.
  • Strong working experience in data warehousing applications; directly responsible for the extraction, transformation and loading of data in multiple formats into the data warehouse.
  • Onshore team lead for Business Intelligence, integration, data sourcing and transformation on development/sustainment projects at Confidential and Verizon.
  • Managed day-to-day interactions with client teams and counterparts; communicated clearly and consistently on component objectives and activities; worked closely with the client to understand the implications of engagement changes and facilitate implementation.
  • Extensive experience in Extraction, Transformation, Loading (ETL) data from various Sources into Data Warehouses and Data Marts using Informatica Power Center.
  • Capability to investigate and identify root cause of complex production issues and support production procedures.
  • Intermediate knowledge of Big Data technologies such as Hadoop, Hive and MapReduce.
  • Conducting reviews of project deliverables. Monitoring and controlling the quality of the deliverables.
  • Involved in acceptance testing for all development and Linux projects.
  • Involved in the development of the business rules and logic in order to harmonize the data.
  • Partner with data quality and project management teams throughout the project lifecycle.
  • Desire and commitment to continuous learning in Teradata, Big Data, analytics and Business intelligence.
  • Possess working experience in Telecom and Banking domains.
  • Involved in model changes for TD tables, correcting data for specific posting dates, and sweeps as part of change management, performance tuning and optimization.
  • Experience in data migration/extraction from different sources, handling failure scenarios and troubleshooting.
  • Supported SLA and non-SLA applications in production.
  • Innovative in approach; enjoys adopting new methodologies and ideas and putting them into daily practice.

TECHNICAL SKILLS:

Databases & Tools: Teradata 15, SQL Assistant, Informatica 8.x.x and 9.1, Oracle 9i/10g, Unix Shell Scripting, Hive and Sqoop (both in learning stage)

Methodologies: Waterfall and Agile

Technologies worked on: TWS (Tivoli Workload Scheduler), Actuate reporting tool, TortoiseSVN, SCCS (versioning tool), SQL*Loader, Quality Center, JIRA, Maximo, BMC Remedy, Toad

Operating System: UNIX, Linux, Microsoft Windows (Server, 2000, XP, 7, 10)

Programming Languages: SQL, C

PROFESSIONAL EXPERIENCE:

Confidential, Temple Terrace, FL

Sr Teradata Developer

Responsibilities:

  • Served as senior ETL consultant from the requirements phase onward.
  • Organized multiple requirements-discussion meetings with business users and source-system experts, and produced functional and technical specifications.
  • A major part of the project involved migration from Oracle to Teradata, including migration of code and data, documentation of the migration process, and extensive testing and validation.
  • Coordinated with VDSI (offshore) on development activities.
  • Developed a standard ETL framework to enable reuse of similar logic across the board.
  • Used BTEQ, MultiLoad and FastLoad to load data from multiple sources into Teradata tables, and FastExport to export target data, which is then transferred to target systems using CFI.
  • Performed database/ETL migrations from the Dev environment to Test/Training/UAT/Staging and Prod environments.
  • Involved in implementing BTEQ and bulk-load jobs.
  • Created BTEQ scripts to extract data from the warehouse for downstream applications.
  • Exported data using BTEQ and FastExport.
  • Used database objects such as views and partitioning to handle complex logical scenarios.
  • Performed end-to-end development of the data warehouse.
  • Prepared the test execution plan and tracked its execution.
  • Extensively wrote Teradata code covering multiple SQL scenarios in Dev for installation in the Prod environment.
  • Prepared MultiLoad, BTEQ, FastExport and FastLoad scripts for loading and modifying data according to requirements.
  • Tuned and optimized SQL/Teradata queries; managed change requests.
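The BTEQ export work above can be sketched as a minimal script; the TDPID, credentials, database, table and file path below are all hypothetical:

```sql
.LOGON tdprod/etl_user,secret;          -- hypothetical TDPID and credentials

.EXPORT REPORT FILE = /data/out/acct_extract.txt

SELECT acct_id
     , acct_status
     , open_dt
FROM   edw_db.account                   -- hypothetical warehouse table
WHERE  open_dt >= DATE '2016-01-01';

.EXPORT RESET
.LOGOFF;
.QUIT;
```

In practice such scripts are driven from UNIX shell wrappers, with FastExport used instead of `.EXPORT` for high-volume extracts.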

Environment: Teradata, UNIX Shell Scripting, Linux, SCCS (Versioning)

Confidential, Richardson, TX

Sr Teradata Developer

Responsibilities:

  • Imported, cleaned, transformed, validated and modeled data in order to understand it and draw conclusions for decision-making purposes.
  • Involved in requirements analysis and design with senior technical architects.
  • Communicated clearly and consistently on component objectives and activities; as part of Business Intelligence, worked closely with the client to understand the implications of engagement changes and facilitate implementation.
  • Transferred files to end users using NDM (Connect:Direct) based on the requirement.
  • Worked on ETL extraction and loading using different utilities.
  • Worked with the client's directors and managers on a daily basis in order to reduce end-user engagement changes and issues.
  • Involved in developing a data-ingestion roadmap and keeping it on schedule.
  • Worked on Teradata stored procedures, macros and functions.
  • Managed data between source and target using the GoldenGate process.
  • Used various Informatica transformations to extract, transform and load data into stage tables, flat files and target tables.
  • Worked with business units to implement rationalized, best-in-class business logic and to process data through stage, journal, base and semantic layers within Teradata and Informatica.
  • Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
  • Strong experience writing BTEQ scripts to transform data.
  • Used GoldenGate data pumps to keep data in sync with the target database.
  • Performed data extraction, transformation and loading from stage to target systems using BTEQ, FastLoad and MultiLoad.
  • Involved in preparing DG and CR deployments, communicating with offshore teams and supporting all types of testing.
  • Composed, modified and deleted TWS schedules; created on-request schedules and modified them per the business plan to run daily, weekly or monthly.
  • Developed and modified scripts; performed unit testing and ST support.
  • Prepared documentation such as runbooks, DGs, deployment-readiness packages and ST handover documents.
  • Delivered monthly status reports and conducted regular meetings with the customer's management team.
  • Provided ST support for environment creation and file transfers in the QA environment.
  • Involved in new/CR deployments and post-deployment support and validation.
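As an illustration of the FastLoad scripting mentioned above, a minimal sketch follows; the system name, databases, table and input file are hypothetical:

```sql
LOGON tdprod/etl_user,secret;           -- hypothetical system and credentials

DATABASE stage_db;                      -- hypothetical staging database

BEGIN LOADING stage_db.customer_stg
    ERRORFILES stage_db.customer_err1, stage_db.customer_err2;

SET RECORD VARTEXT "|";                 -- pipe-delimited input file

DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       cust_dob  (VARCHAR(10))
FILE = /data/in/customer.dat;

INSERT INTO stage_db.customer_stg
VALUES (:cust_id, :cust_name, :cust_dob);

END LOADING;
LOGOFF;
```

FastLoad targets an empty table and stages rejected rows in the two error tables; MultiLoad follows the same shape but supports inserts, updates and deletes against populated tables.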

Environment: Informatica 8.6, Oracle, Teradata, TWS, Linux and UNIX

Confidential

Teradata Developer

Responsibilities:

  • Analyzed the given documents, then understood and implemented them.
  • Integrated day-to-day transactions into the enterprise data warehouse.
  • Created and maintained various scripts (BTEQ, FastLoad and MultiLoad).
  • Understood issues and proposed resolutions.
  • Understood client requirements and prepared documents listing the various sources and targets.
  • Developed stored procedures and functions.
  • Identified the different sources feeding data into the warehouse.
  • Created unit test cases and documented unit test results.
  • Performed performance tuning and SQL/code enhancements to meet performance targets, using EXPLAIN plans.
  • Performed data extraction, transformation and loading from stage to target systems using BTEQ, FastLoad and MultiLoad.
  • Delivered monthly status reports and conducted regular meetings with the customer's management team.
  • Involved in team meetings, updates and project event discussions.
  • Provided ST support for environment creation and file transfers in the QA environment.
  • Created global and volatile temporary tables in BTEQ scripts for efficient perm and spool space utilization, and to break complex queries into simpler ones.
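The volatile-table technique in the last bullet can be sketched as follows; object names are illustrative only:

```sql
/* Stage a complex intermediate result in a volatile table: it lives in
   spool space for the session only and needs no explicit cleanup. */
CREATE VOLATILE TABLE vt_active_accts AS
(
    SELECT acct_id, balance
    FROM   edw_db.account               -- hypothetical warehouse table
    WHERE  acct_status = 'A'
)
WITH DATA
PRIMARY INDEX (acct_id)
ON COMMIT PRESERVE ROWS;                -- keep rows after each transaction

/* The simplified follow-on query joins against the staged subset. */
SELECT a.acct_id, a.balance, t.txn_cnt
FROM   vt_active_accts a
JOIN   edw_db.txn_summary t             -- hypothetical summary table
ON     a.acct_id = t.acct_id;
```

Splitting one complex query into a staged volatile table plus a simple join often reduces spool pressure and makes the optimizer's EXPLAIN plan easier to tune.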

Environment: Informatica 8.6, Oracle, Teradata, TWS, Linux and UNIX

Confidential

Lead Teradata Developer

Responsibilities:

  • ECR SIS (Short Interview Schedule) call reviews (ECR deployments).
  • SIS call reviews (weekend deployments).
  • Deployed CRs, ECRs and batch-only CRs in production.
  • Performed Informatica folder creations and password updates.
  • Deployed Informatica and Teradata code in production.
  • Resolved failures with appropriate solutions.
  • Created folders, users and roles.
  • Resolved AOTS trouble tickets.
  • Resolved Webtrax WR tickets.
  • Updated DW BID collection tables and performed reporting activities.

Environment: Informatica Power center 8.5, Teradata, Oracle, UNIX

Confidential

ETL Developer

Responsibilities:
  • Developed reports for users with the Actuate reporting tool.
  • Extensively involved in data extraction, transformation and loading (the ETL process) from source to target systems using Informatica.
  • Created mappings implementing business rules to load data.
  • Used various transformations (Source Qualifier, Lookup, Router, Aggregator, Filter, Sequence Generator and Update Strategy) to handle situations as the requirements demanded.
  • Customized code for new enhancements.
  • Acted as offshore team lead, updating clients on issue status on a daily basis.
  • Assigned tasks to the offshore team and ensured their completion.
  • Created and modified reports per user requirements using the Actuate reporting tool.
  • Actively participated in gathering user requirements.
  • Completed PL/SQL development and enhancements on a weekly basis.
  • Created database objects (views and functions) and developed procedures for implementation.
  • Developed Oracle objects such as tables, views, stored procedures, packages, global temporary tables and indexes.
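The Oracle global temporary table and procedure work above can be sketched as follows; the table, column and procedure names are hypothetical:

```sql
-- Oracle global temporary table: the definition is permanent, but rows are
-- private to each session and are discarded at session end.
CREATE GLOBAL TEMPORARY TABLE gtt_order_work
(
    order_id  NUMBER,
    order_amt NUMBER(12,2)
)
ON COMMIT PRESERVE ROWS;

-- A simple PL/SQL procedure that stages recent rows into the work table.
CREATE OR REPLACE PROCEDURE load_order_work AS
BEGIN
    INSERT INTO gtt_order_work (order_id, order_amt)
    SELECT order_id, order_amt
    FROM   orders                       -- hypothetical source table
    WHERE  order_dt >= TRUNC(SYSDATE) - 7;
END;
/
```

With `ON COMMIT DELETE ROWS` instead, the staged rows would vanish at each commit rather than at session end.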

Environment: Informatica 8.x, PL/SQL, Actuate Reporting tool

Confidential

ETL Developer

Responsibilities:

  • Performed ETL development based on the architect's HLD plan.
  • Identified the various sources from which data is input to the warehouse.
  • Extracted data from different interface systems and sources, transformed/mapped it to the target format, and loaded it into the warehouse.
  • Applied unit test cases covering all conceivable conditions to identify bugs and ensure the loaded data is defect-free.
  • Distributed the data so that end users can easily access it and make business decisions.
  • Implemented security features that restrict unauthorized user access to data in the warehouse.
  • Supported and improved the performance of existing data warehouse applications.

Environment: Informatica 7.x, 8.x, Oracle 10g SQL, Autosys and IRIS

Confidential

ETL Developer

Responsibilities:

  • Resolved service calls, problem tickets and user queries; granted and revoked end-user permissions (in the RightNow tool).
  • Involved in peer review and quality-related activities at various phases, such as unit testing, impact analysis and resolving UAT issues.
  • Created user accounts and extended access to users.
  • Performed component integration testing to ensure each individual component/module works when integrated with other application components/modules.
  • Analyzed business requirements and system specifications to understand the application.

Environment: RightNow (CRM Tool), Putty, Oracle 10g
