
Senior ETL Analyst, Teradata, Informatica, Unix Resume


Plano, TX

SUMMARY

  • 12 years of IT experience in Teradata development, ETL, requirements gathering and data warehousing for clients in the Banking, Healthcare and Retail domains
  • 7+ years of experience in the Banking domain, with expertise in retail banking concepts
  • 4 years of expertise in Teradata FSLDM concepts
  • Attend business meetings, gather requirements and convert business requirements into design documents
  • Design, develop and review Teradata code with Informatica PowerCenter 9.5.1/10.0 and DataStage 11.5 as ETL tools on UNIX; create shell scripts and DIH code
  • Understand the data model and create source-to-target mappings for the ETL process
  • Worked on data and system migration projects, performing ETL operations with UNIX, Informatica and Mainframes against Teradata and Oracle 11g
  • Expertise in SDLC project deliveries under both traditional and Agile methodologies
  • Expert in creating source-to-target mappings and value-added processes (VAP) in ETL design
  • Expertise in Teradata architecture (indexes, space, locks, data distribution and retrieval, data protection)
  • Good knowledge of prescribing indexes and the types of tables to create in staging and target areas, such as temporary, SET, MULTISET, volatile and global temporary tables
  • Worked on Teradata utilities: FastLoad, MultiLoad (MLoad), BTEQ and FastExport
  • Successful in projects requiring Teradata SQL performance tuning: collecting statistics and analysing join strategies, join types and EXPLAIN/optimizer plans for queries with OLAP functions, derived tables and complex joins
  • Knowledge of various indexes such as Join Index, Multi-Table JI and Single-Table JI
  • Good knowledge of Teradata standards, best practices and ANSI SQL
  • Expertise in writing unit test cases and scenarios and validating unit test results
  • Write UNIX shell scripts, using sed and awk for file handling
  • Create mappings, sessions and workflows in Informatica PowerCenter, using different transformations and handling files (csv, xml) and databases as sources and targets
  • Gather requirements, write design documents and engage in data analysis along with Business Analysts and Enterprise Architects
  • Perform scheduling and job execution in Autosys and WCC
  • Worked in Agile methodologies, with tracking in Rally
  • Worked on Type I, II and III data models and performed data cleansing before commencing the ETL process for data from various source systems
  • Responsible for supporting QA, UAT, and functional and nonfunctional testing
  • Possess strong analytical and problem-solving skills; fast learner in gaining functional knowledge of requirements
  • Prepared presentations on Teradata architecture, DML, DDL, and basic and OLAP functions for training new resources
  • Involved in reviewing design documents against Teradata best practices and standards
  • Part of the Teradata interview panel
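The sed/awk file handling listed above can be illustrated with a minimal sketch; the file name, columns and filter rule are hypothetical, not from an actual engagement:

```shell
#!/bin/sh
# Hypothetical csv cleansing with sed and awk; names are illustrative.
cat > /tmp/accounts.csv <<'EOF'
acct_id, balance ,status
1001, 250.00 ,ACTIVE
1002, 0.00 ,closed
EOF

# sed trims spaces around the delimiter; awk keeps the header
# plus rows whose status is ACTIVE (case-insensitive).
sed 's/ *, */,/g' /tmp/accounts.csv \
  | awk -F',' 'NR==1 || toupper($3)=="ACTIVE"' > /tmp/accounts_clean.csv

cat /tmp/accounts_clean.csv
```

The same pattern extends to fixed-width files by switching awk to `substr()` extraction.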

TECHNICAL SKILLS

Database and systems: Teradata and IBM Mainframe

Teradata Utilities: MLoad, FastLoad, BTEQ, FastExport and TPump

ETL: Informatica PowerCenter 9.5.1, Data Integration Hub, DataStage 11.5

Oracle Utilities: SQLPLUS, SQLLDR

Scripting: UNIX Shell Scripting

Working Methodologies: Waterfall, Agile

Operating Systems: Windows, UNIX, IBM Mainframe

Scheduling Tools: CA-7, Endevor, Autosys, WCC

Test case tracking tools: HP Quality Center

Version Control Tools: ChangeMan, VSS, Endevor

Languages: JCL, SQL, Shell Scripting

Other Client Tools: SQL Assistant, Visio, Mainframe File-AID, OPCA, Teradata Viewpoint, DbVisualizer

PROFESSIONAL EXPERIENCE

Confidential, Plano TX

Senior ETL Analyst, Teradata, Informatica, Unix

Responsibilities:

  • Responsible for analysing the existing data and the data flow in the EDW, which holds the client's enterprise data and is built on Teradata 16
  • Analyse data held outside the EDW in databases such as Teradata, Oracle, SQL Server and Data Integration Hub, accessed by code in Mainframes, UNIX etc., and provide ETL solutions using Teradata and Informatica PowerCenter 10.1 to bring it into the EDW
  • Attend business meetings to gather requirements, create a design strategy, convert functional requirements into technical designs and write design documents
  • Write complex Teradata queries that apply various join strategies to arrive at the design; create UNIX shell scripts to handle files such as .csv and .xml
  • Perform data analysis before creating the design, and perform data cleansing of files
  • Create source-to-target mappings before coding by understanding the physical and logical data models, working along with the data modeler
  • Write Teradata components using the BTEQ, TPT, MLoad and FastLoad utilities for the migration activity, based on the design
  • Create Informatica mappings and workflows, using various transformations, to perform the ETL aspects of the data migration
  • Analyse data matches and tables across various databases to modify the source
  • Perform detailed testing of the developed queries, tune them for performance and tune under-performing queries
  • Define an approach for statistics collection on newly created and modified tables
  • Calculate the space needed (in GB) for new incoming data and raise requests to the DBA
  • Implement performance tuning techniques and write advanced SQL queries to find the equivalent technical columns for the corresponding business terms
  • Perform scheduling using Control-M.
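The statistics-collection approach described above can be sketched as a shell wrapper that builds a BTEQ script; the tdpid, credentials, database and table names below are placeholders:

```shell
#!/bin/sh
# Build a BTEQ script that refreshes statistics on a staging table.
# All object names and logon details are hypothetical.
cat > /tmp/collect_stats.btq <<'EOF'
.LOGON tdprod/etl_user,password;
COLLECT STATISTICS COLUMN (acct_id) ON edw_stg.daily_txn;
COLLECT STATISTICS COLUMN (txn_dt)  ON edw_stg.daily_txn;
.LOGOFF;
.QUIT;
EOF

# In the real environment this would be submitted as:
#   bteq < /tmp/collect_stats.btq > /tmp/collect_stats.log 2>&1
grep -c 'COLLECT STATISTICS' /tmp/collect_stats.btq
```

Generating the script from a driver table of column names keeps the stats refresh in step with DDL changes.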

Confidential., Dearborn MI

Senior ETL Programmer Analyst, Teradata, Mainframes, DataStage

Responsibilities:

  • Responsible for working in the Enterprise Data Warehouse (EDW), analysing jobs created in Mainframes, Teradata and Extract-Transform-Load (ETL) utilities such as DataStage that are used in the client environment to access the EDW, bring in new data and create reports on existing data
  • Perform migration activity on components using JCL (Job Control Language), parmcards, BTEQ, FastLoad and MLoad
  • Understand the logical and physical data models and table relationships, and create mapping documents based on analysis of existing jobs in order to create new components
  • Perform proofs of concept on converting SQL into DataStage stages to enable lineage
  • Attend business meetings, create a design strategy and convert functional requirements into technical designs
  • Interpret the various business terms and identify/mine their equivalent technical terms inside the data warehouse built on Teradata
  • Write various SQL queries to update existing data based on business requirements
  • Implement performance tuning techniques and write advanced SQL queries to find the equivalent technical columns for the corresponding business terms
  • Develop new components in Teradata and Mainframes, and write design documents
  • Create DataStage components using various DataStage stages
  • Perform proofs of concept in converting complex SQL code into DataStage mappings
  • Tune existing DataStage code for better performance
  • Perform scheduling in Autosys and job monitoring in Autosys and WCC.
  • Perform post-development support activities as part of Agile methodologies.
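Autosys jobs of the kind scheduled above are defined in JIL; a hypothetical sketch follows, where the job name, machine and paths are illustrative only:

```shell
#!/bin/sh
# Generate a JIL definition for a daily EDW load job; all names,
# machines and paths are placeholders, not real environment values.
cat > /tmp/edw_daily_load.jil <<'EOF'
insert_job: edw_daily_load   job_type: CMD
command: /apps/edw/bin/daily_load.sh
machine: edwprod01
owner: etl_user
start_times: "02:00"
std_out_file: /apps/edw/logs/edw_daily_load.out
std_err_file: /apps/edw/logs/edw_daily_load.err
alarm_if_fail: 1
EOF

# Loaded into Autosys with: jil < /tmp/edw_daily_load.jil
cat /tmp/edw_daily_load.jil
```

Downstream jobs would chain off this one via a `condition: success(edw_daily_load)` attribute.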

Confidential, Plano TX

Teradata, Informatica Analyst and Developer

Responsibilities:

  • Working as a Teradata analyst and developer in the retail domain; responsible for gathering requirements, developing SQL code and unit testing using the Teradata utilities MLoad and BTEQ
  • Attend business meetings, gather requirements and convert business requirements into design documents
  • Understand the data model and write source-to-target mappings for the ETL process
  • Perform data quality checks based on the test data
  • Create mappings and workflows in Informatica PowerCenter 9.5.1 and 10.1, using different transformations
  • Write shell scripts to handle csv, xml and txt files in UNIX, using sed and awk commands for file handling and writing trigger scripts
  • Involved in performance tuning of Teradata SQL queries, creating source-to-target mapping sheets and handling statistics on tables
  • Prescribe the types of tables to create in staging and target areas, such as temporary, SET, MULTISET, volatile and global temporary tables
  • Generate reports by writing ad-hoc SQL queries
  • Responsible for creating and executing unit tests and ensuring performance tuning using Informatica PowerCenter 9.5.1 and 10.1
  • Develop new components in Informatica Data Integration Hub (DIH), a recent Informatica tool; good understanding of DIH components and concepts
  • Use the Teradata utilities FastLoad, MLoad, BTEQ and TPump; responsible for performance tuning
  • Worked in Oracle 11g, using utilities such as SQL*Plus and SQL*Loader (sqlldr).
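The SQL*Loader work above is driven by a control file; a minimal hypothetical example, with table, column and path names chosen for illustration:

```shell
#!/bin/sh
# Generate a SQL*Loader control file for a csv load into a staging
# table; the table, columns and paths are placeholders.
cat > /tmp/load_member.ctl <<'EOF'
LOAD DATA
INFILE '/data/in/member.csv'
APPEND
INTO TABLE stg_member
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(member_id, first_nm, last_nm, dob DATE 'YYYY-MM-DD')
EOF

# Real invocation would be:
#   sqlldr userid=etl/password control=/tmp/load_member.ctl
cat /tmp/load_member.ctl
```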

Environment: Teradata, Informatica PowerCenter 9.5.1 and 10.1, Data Integration Hub (DIH), SQL Assistant, UNIX, Oracle, DbVisualizer.

Confidential, Richardson, Dallas TX

Programmer Analyst - Teradata and ETL Analyst

Responsibilities:

  • ETL technical lead responsible for gathering requirements, design, and assisting with developing and reviewing code on healthcare projects in Teradata 13, with Informatica PowerCenter 9.5.1 as the ETL tool on the UNIX operating system
  • Perform data analysis alongside solution architects for requirements specifications
  • Create history files on an ad-hoc basis after project deployment, based on business requirements
  • Responsible for creating design documents and source-to-target mappings per functional requirements, and presenting them to client-side leads and SMEs for approval
  • Provide QA support during ST, QAT and UAT
  • Perform the release management process in Serena
  • Good knowledge of the healthcare subject areas CLIENT, MEMBER, ACCOUNT, GROUP, DRUG and PRESCRIBER
  • Create Teradata SQL, BTEQ/MLoad/FastLoad components, and Informatica workflows, sessions and mappings based on the data model and requirements.

Environment: Teradata, Informatica PowerCenter 9.5.1, SQL Assistant, UNIX, Oracle, design and documentation.

Confidential

Senior Software Engineer - Teradata and Enterprise GDW SME

Responsibilities:

  • As SME (Subject Matter Expert) for the bank's data warehouse and Teradata lead, responsible for successfully handling and delivering projects in the Retail portfolio
  • In-depth knowledge of the various databases that hold each type of data in the warehouse
  • Working on Teradata 12, Mainframes, ETL and FSLDM
  • The core database of the warehouse was built on Teradata FSLDM; expertise in the FSLDM subject areas AGREEMENT, PARTY, FEATURE, EVENT and PRODUCT
  • Knowledge of using different types of indexes such as Join Index, Multi-Table JI and Single-Table JI
  • Good understanding of the upstream and downstream systems and applications that have data dependencies on the warehouse
  • Perform requirements gathering by liaising with BAs, EAD and other stakeholders, and transform the requirements into design documents
  • Work with data modelers to map the data into the FSLDM-based database
  • Responsible for performance tuning, space additions, scheduling etc. for handling new data
  • Worked on simplification projects covering closure accounts, Money Manager, chip migration etc.
  • Worked on a migration project involving the bank's mortgage data
  • Used FastLoad, FastExport and BTEQ scripts for handling ETL requirements
  • Performed multiple Impact Assessments (IAs) before study, to identify any impacts to the platform and to give an estimate with a variance level of ±50.
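A FastLoad script of the kind used for such migration loads could look like the sketch below; the tdpid, credentials, database and table names are hypothetical:

```shell
#!/bin/sh
# Generate a FastLoad script for bulk-loading a csv into an empty
# staging table; all object names and logon details are placeholders.
cat > /tmp/load_mortgage.fl <<'EOF'
LOGON tdprod/etl_user,password;
DATABASE edw_stg;
SET RECORD VARTEXT ",";
DEFINE acct_no (VARCHAR(18)),
       bal     (VARCHAR(20))
FILE=/data/in/mortgage.csv;
BEGIN LOADING edw_stg.stg_mortgage ERRORFILES edw_stg.err1, edw_stg.err2;
INSERT INTO edw_stg.stg_mortgage (acct_no, bal) VALUES (:acct_no, :bal);
END LOADING;
LOGOFF;
EOF

# Submitted in the real environment as: fastload < /tmp/load_mortgage.fl
cat /tmp/load_mortgage.fl
```

FastLoad requires an empty target table, so it suits the initial staging step; onward merges into history tables would use MLoad or BTEQ.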

Environment: Teradata, TD utilities, ETL, performance tuning, mapping, SQL Assistant, UNIX, Mainframes z/OS, JCLs

Confidential

Senior Software Engineer - Teradata and GDW SME

Responsibilities:

  • Attend business workshops before commencing a new project or introducing new technology
  • Gather requirements and regularly attend business meetings representing the platform; provide updates to the CIO and necessary clarifications to EAD for any queries related to GDW data
  • Submit design documents to the bank's technical team and obtain design approval before proceeding to build and UT
  • Validate developed components against requirements, performance and platform standards
  • Support implementation of projects
  • Create designs and develop SQL components such as BTEQ scripts, load utilities and DDLs in Teradata 12, and JCLs in Mainframes
  • Ensure performance is maintained in the code using parameters such as statistics, joins, indexes and DDLs
  • Have good knowledge of the data in the warehouse; point of contact for BAs in identifying relevant data per their requirements; generated one-off extracts, outside the SDLC, using Teradata and Mainframes
  • Create data mappings for new data into the database and for extracts from the warehouse
  • Work with the support team in case of batch failures of production jobs

Environment: Teradata, TD utilities, ETL, mapping, SQL Assistant, UNIX, Mainframes z/OS, JCLs, data migration, performance tuning, Viewpoint, design and functional documents
