
Senior Informatica/Teradata Developer and Analyst Resume


SUMMARY

  • 9.5+ years of IT experience in software development and implementation of business applications.
  • Teradata 14 Certified Professional
  • Teradata 14 SQL Certified, Teradata 14 Certified Technical Specialist
  • 7.5 years of ETL experience in Implementation of Data Warehouse Projects using Teradata, Informatica, UNIX and Mainframe.
  • 6.5 years of experience in Banking/Financial Domain
  • Extensive experience in Project Planning, Effort estimation, Business analysis to meet needs of Clients and developing efficient solutions.
  • Good experience in design, enhancement, and development of Teradata data warehouse applications for OLTP, OLAP, and DSS.
  • Sound knowledge of data warehousing concepts - Bill Inmon / Ralph Kimball paradigms, dimensional data modeling, relational data modeling (3NF), star and snowflake schemas.
  • Strong Experience in Data modeling using PowerDesigner - Conceptual Data Model, Logical Data Model and Physical Data Model, Reverse Engineering, Merge Models and Reports.
  • Sound Knowledge on Teradata Architecture (PE, AMPs, BYNET, Indexes, Data Distribution, Data Retrieval, Data Protection and Locking).
  • Excellent knowledge on Teradata SQL Performance tuning for optimal performance. Sound knowledge on Collect statistics, Derived tables, Volatile/Global Temporary tables, Join strategies, Join types and Explain/Optimizer plans.
  • Excellent Knowledge on writing Teradata Macros, Semantic Views, Stored Procedures, Triggers, Join index and Partitioned Primary Index. Developed Macros for Data validation and Attribute/Object Usage reports.
  • Excellent knowledge on Teradata Standards, best Practices and ANSI SQL.
  • Good knowledge of compression features in Teradata (multi-value compression, algorithmic compression, block-level compression).
  • Expertise in creating complex ad hoc SQL queries to support user help requests within a short time frame using OLAP functions, derived tables and complex joins.
  • Expertise in Teradata utilities - TPT, FastLoad, MultiLoad (MLOAD), BTEQ, FastExport, TPUMP and ARCMain (Archive/Restore).
  • Good experience using Teradata Visual Explain, Teradata Administrator, Statistics Wizard, Index Wizard, View Point and TAS (Teradata Analytics for SAP).
  • Excellent knowledge of implementing mappings and value-added processes (VAP).
  • Good Data Warehousing ETL experience using Informatica PowerCenter Client tools - Power Center Designer - Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplets, Workflow Manager, Workflow Monitor and Repository Manager.
  • Worked extensively with Slowly Changing Dimensions.
  • Good experience on Informatica Data Replicator (IDR).
  • Good experience on UNIX Shell Scripting and automation of ETL processes using UNIX scripting and Autosys.
  • Good Experience on Data Analysis on various databases - Teradata, Oracle, MS SQL Server.
  • Good Knowledge on ETL processing using SSIS
  • Good experience in creating ad hoc reports for business users using Tableau.
  • Good working experience in the Banking/Financial, Telecom and Agriculture domains.
  • Excellent Knowledge on Customer Credit Bureau Processes for Bank Portfolio Management, different Scores (FICO and other Credit Bureau Proprietary scores) and Triggers.
  • Excellent Knowledge on Consumer loans and loan origination, servicing, guarantor and loss and recovery processes in banking.
  • Identified and converted independent workflows into reusable/common workflows, reducing maintenance cost.
  • Good Working experience on Mainframe COBOL, DB2, JCL, REXX, EZTRIEVE, SYNCSORT, IEBGENER, IDCAMS, NDM, File-Aid, File Manager, CA1 Tape, VSAM, GDG, ChangeMan, Endevor, Autosys and CA7 Scheduling.
  • Experience in Full Cycle of Software Development (SDLC) including Requirement Analysis, System study, Design, Development, Unit testing, System Testing, Integration Testing, System Maintenance, Production Support and Documentation.
  • Excellent team player; led a team of 15 offshore resources. Strong verbal and written communication skills.
  • Good experience coordinating multi-vendor teams across different locations.
  • Strong analytical and problem-solving skills. Fast learner.
  • Excellent presentation skills. Prepared presentations on Teradata architecture, DML, DDL, basic and OLAP functions.
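
The multi-value compression work noted above can be sketched in DDL. This is a minimal illustration only; the database, table, column names and value lists (fin_db.account_dim, etc.) are hypothetical, not actual project objects:

```sql
-- Minimal multi-value compression (MVC) sketch. Values listed in the
-- COMPRESS clause are stored once in the table header rather than in
-- every row, reclaiming space for frequently repeated column values.
-- All names and value lists here are hypothetical.
CREATE TABLE fin_db.account_dim (
    account_id   INTEGER NOT NULL,
    account_type CHAR(10) COMPRESS ('CHECKING', 'SAVINGS', 'LOAN'),
    status_cd    CHAR(1)  COMPRESS ('A', 'C'),
    open_dt      DATE
)
PRIMARY INDEX (account_id);
```

Candidate values for the COMPRESS list are usually chosen from frequency counts on the column, since only the most repeated values pay back the header overhead.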

TECHNICAL SKILLS

Database Systems: Teradata, Oracle, MS SQL Server, DB2

ETL Tools/Systems: Informatica, Mainframe, SSIS, UNIX

Teradata Utilities and Tools: TPT, MultiLoad (MLOAD), FastLoad, BTEQ, FastExport, TPUMP and ARCMain (Archive and Restore), Teradata SQL Assistant, Teradata Administrator, Visual Explain, Index Wizard, View Point, TAS (Teradata Analytics for SAP)

Informatica Tools: Informatica 9.1/9.5/9.6 Power Center tools - Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager, Informatica Administrator console, IDR (Informatica Data Replication)

Operating Systems: Windows, z/OS, UNIX

Scheduling Tools: CA7, Autosys, CRON

Version Control Tools: z/OS ChangeMan, z/OS Endevor and VSS

Data Modeling Tool: PowerDesigner

Languages: SQL, COBOL, JCL, Excel VB Macro, REXX, EZTRIEVE

Other Client Tools: Visio, OpenProj, FTP, Mainframe File-Aid, File Manager, CA1 Tape, NDM (Network Data Mover), ABARS, Putty, WinSCP, Maximo, Remedy, Tableau, Quality Center, MS SQL Server Management Studio, SSIS, Oracle SQL Developer.

PROFESSIONAL EXPERIENCE

Confidential

Senior Informatica/Teradata Developer and Analyst

Language: SQL, UNIX Shell Script

Database: Teradata, Oracle, MS SQL Server

Tools: Teradata Utilities (TPT, MLOAD, FASTLOAD, BTEQ, XPORT, TPUMP), Autosys, Oracle SQL Developer, Teradata SQL Assistant, UNIX, CRON, Putty, Visio, FTP, Informatica 9.5/9.6, Remedy, Informatica Data Replication (IDR), Teradata Administrator, View Point, Visual Explain, Index Wizard, TAS (Teradata Analytics for SAP), MS SQL Server Management Studio, SSIS.

Responsibilities:

  • Coordination with clients, Requirements Gathering and Impact Analysis
  • Design, develop, integrate and implement related application components including front-end and server-side implementation and database integration.
  • Participate in business JAD sessions, create application architecture documents.
  • Project Planning, estimation and resource allocation
  • ETL Processing using Informatica 9.5/9.6.
  • Create Mappings, transformations and Workflows involving ODBC, Oracle, TPT/Teradata, Relational, HTTP and File connections.
  • Informatica Power Center Client tools - Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplets, Workflow Manager, Workflow Monitor and Repository Manager.
  • Scheduling of Informatica Workflows using Autosys Scheduler.
  • Create UNIX automation scripts for basic batch reporting, promotion and support activities.
  • Data Analysis on Teradata, Oracle and MS SQL server databases.
  • Performance Tuning of Teradata SQL queries.
  • Create/modify Teradata Tables, Views and Stored Procedures.
  • Coding and Unit Testing for Teradata SQLs, TPT, MLOAD, FastLoad, FastExport, TPUMP and BTEQ Script.
  • Create Secondary Index/Join Index to improve performance of Teradata Queries.
  • Use Teradata Administrator, Visual Explain, Index wizard, View Point.
  • Forecast database space; implement multi-value compression on Teradata tables to reclaim space.
  • Debug Informatica ETL code for data issues reported by business.
  • ETL processing using Microsoft SSIS. Conversion of ETL process from SSIS to Informatica.
  • Create data validation controls and reports using Teradata Statistical and OLAP functions.
  • Use DBC Tables for the performance measurement, Space calculations, Teradata objects analysis and usage reports on attributes.
  • Work on complex ad hoc queries to support user help requests within a short time frame.
  • Perform rock-and-roll loads for the largest model tables.
  • Upgrade of Informatica from 9.5 to 9.6 and consolidation of ETL servers - Planning, co-ordination and testing.
  • TAS (Teradata Analytics for SAP) and Informatica Data Replicator (IDR) to replicate data from Oracle to Teradata - Build, Test and Production Implementation.
  • Work with vendor (Informatica and Teradata) for issues and co-ordinate patch fix.
  • Monitor Teradata Performance and provide suggestions for Top impact queries.
  • Production Support – escalation point for production issues.
  • Manage a team of 7 offshore resources
  • Support and coach peers and juniors as and when required on specific technical competencies
  • Support Client interfacing activities and help create the relevant reports in a timely manner
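
The performance tuning responsibilities above typically pair COLLECT STATISTICS with EXPLAIN to inspect the optimizer plan before and after a change. A hedged sketch; table and column names are illustrative, not from the actual project:

```sql
-- Refresh the demographics the optimizer uses for join planning and
-- row estimates (fin_db.txn_fact and its columns are hypothetical).
COLLECT STATISTICS COLUMN (account_id) ON fin_db.txn_fact;
COLLECT STATISTICS COLUMN (txn_dt)     ON fin_db.txn_fact;

-- EXPLAIN returns the plan (join strategy, row redistribution,
-- estimated rows and cost) without executing the query.
EXPLAIN
SELECT a.account_type, SUM(t.txn_amt)
FROM   fin_db.txn_fact t
JOIN   fin_db.account_dim a
  ON   a.account_id = t.account_id
GROUP BY a.account_type;
```

Plans that redistribute the larger table or show "product join" steps are the usual candidates for index or statistics changes.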

Confidential, Charlotte

Technology Lead

Language: SQL, JCL, COBOL, REXX, EZTRIEVE

Database: Teradata

Tools: Teradata Utilities (TPT, MLOAD, FASTLOAD, BTEQ, XPORT, TPUMP), Changeman, Endevor, CA7, Autosys, File-Aid, File Manager, Teradata SQL Assistant, UNIX, Tectia – SSH, Maximo, ABARS, Visio, PowerDesigner, OpenProj, NDM (Network Data Mover), FTP, Informatica 9.1/9.6, Remedy, Tableau Desktop, Quality Center.

Responsibilities:

  • Worked with business to determine individual Project ETL/Data Warehouse requirements and convert them to Data Models and technical requirements.
  • Worked on Project Planning, Effort estimation and resource allocation.
  • Managed a team of 9 to 15 offshore resources over the past 4.5 years.
  • Worked in a challenging environment with multi-vendor development team, business team and testing team. Worked on Concurrent requests from small Enhancements to Major/new projects.
  • Actively participated in JAD sessions with Vendor team, Business and Technology Team.
  • Created high-level and low-level design documents.
  • Worked on creating Conceptual, Logical and Physical Data Models for Score infrastructure and Traceability Control File projects and Microstrategy Reporting.
  • Implemented Slowly Changing Dimensions - Type 1, 2 and 3.
  • Worked on implementing the value added processes (VAP) using BTEQ. Worked on Coding and Unit Testing of Teradata SQLs, TPT, MLOAD, FastLoad, FastExport and BTEQ scripts.
  • Worked on creating UNIX shell scripts and automation of the ETL processes using UNIX shell scripting and Autosys.
  • Worked on creating ETL process for FairLending application using Informatica PowerCenter. Created Mappings and Mapplets to transform the data (File/Oracle/Teradata) according to the business rules.
  • Worked on Informatica PowerCenter Client tools – Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplets, Workflow Manager, Workflow Monitor and Repository Manager.
  • Worked on various transformations like Source Qualifier, Joiner, Lookup, Router, Filter, Expression and Update Strategy. Implemented Slowly Changing Dimensions.
  • Worked on Metadata and Data Lineage for business elements.
  • Generated Tableau dashboards with combination charts for clear understanding. Reports were published/scheduled by the COE team.
  • Worked on data archival and Restore using ARCMAIN utility.
  • Extensively used Changeman and Endevor for version control.
  • Worked on migrating data from SASPLEX to Teradata using UNIX scripts.
  • Worked on designing database objects for optimal performance.
  • Worked on CA7 Scheduling for Mainframe jobs.
  • Implemented Compression on Teradata tables and saved 31% of the total database space (per year).
  • Worked on creating parallel environment and extensive validation plan for upstream Platform Migration.
  • Performed rock-and-roll loads for the largest model tables.
  • Worked on automated validations to ensure data integrity between different production environments.
  • Extensively used DBC Tables for the performance measurement, Space calculations, Teradata objects analysis and usage reports on attributes.
  • Participated in rapid responses during production run and provided technical solutions for the business problems.
  • Trained new resources on both Technology and business.
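
The Slowly Changing Dimension (Type 2) work above is commonly a two-step expire-and-insert pattern in Teradata SQL. A minimal sketch under assumed names (fin_db.cust_dim, stg_db.cust_stage and their columns are hypothetical):

```sql
-- Hypothetical SCD Type 2 sketch. Step 1 expires the current row for
-- customers whose tracked attribute changed; step 2 inserts the new
-- version (and brand-new customers, who have no current row).
UPDATE dim
FROM fin_db.cust_dim AS dim, stg_db.cust_stage AS stg
SET eff_end_dt   = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE dim.cust_id      = stg.cust_id
  AND dim.current_flag = 'Y'
  AND dim.cust_addr   <> stg.cust_addr;

INSERT INTO fin_db.cust_dim
    (cust_id, cust_addr, eff_start_dt, eff_end_dt, current_flag)
SELECT stg.cust_id, stg.cust_addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_db.cust_stage AS stg
WHERE NOT EXISTS (SELECT 1
                  FROM fin_db.cust_dim AS dim
                  WHERE dim.cust_id      = stg.cust_id
                    AND dim.current_flag = 'Y');
```

Running the expire step first lets one NOT EXISTS insert cover both changed and new customers.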

Confidential, Charlotte

Technology Lead

Language: SQL, JCL, EZTRIEVE

Database: Teradata

Tools: Teradata Utilities (TPT, MLOAD, FASTLOAD, BTEQ, XPORT), Changeman, Endevor, SYNCSORT, CA7, File-Aid, File Manager, Teradata SQL Assistant, Maximo, ABARS, NDM (Network Data Mover), FTP.

Responsibilities:

  • Worked with business to gather requirements and created High Level design document.
  • Worked on Project Planning, Effort estimation and resource allocation.
  • Worked on EZTRIEVE programs to implement VAP for LEVER application which sources credit bureau data for business analysis.
  • Worked on Coding and Unit Testing of Teradata SQLs, TPT, MLOAD, FastLoad, FastExport and BTEQ scripts.
  • Worked on Performance Tuning of Poorly performing production queries.
  • Worked on CA7 Scheduling for Mainframe jobs.
  • Worked on data archival and Restore using ARCMAIN utility.
  • Worked on forklifting data from production Teradata tables to lower environments for testing and business-user parallel testing.
  • Extensively used Changeman and Endevor for version control.
  • Performed rock-and-roll loads for large tables.
  • Created Mainframe file back-up using ABARS.
  • Converted Value added process from EZTRIEVE programs to SQL in BTEQ.
  • Handled production incidents and fixed data issues.
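
The EZTRIEVE-to-BTEQ conversion above typically produces a batch BTEQ script that runs the value-added SQL and fails the job step on error. A minimal sketch; the TDPID, logon and table names (tdprod, etl_user, bureau_db.trade_line) are placeholders:

```sql
-- Hypothetical BTEQ value-added process (VAP) step.
.LOGON tdprod/etl_user,password;

UPDATE bureau_db.trade_line
SET risk_band = CASE
                  WHEN fico_score >= 740 THEN 'LOW'
                  WHEN fico_score >= 660 THEN 'MEDIUM'
                  ELSE 'HIGH'
                END;

-- Abort with a non-zero return code so the scheduler (CA7/Autosys)
-- flags the failure instead of running downstream steps.
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

The ERRORCODE check after each statement is what lets the mainframe scheduler distinguish a clean run from a failed one.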

Confidential, Charlotte

Technology Analyst/Programmer Analyst

Language: SQL, JCL, COBOL, REXX, Excel VB Macro

Database: Teradata

Tools: Teradata Utilities (MLOAD, FASTLOAD, BTEQ, XPORT, TPUMP), UNIX, Changeman, Visio, CA7, File-Aid, Teradata SQL Assistant, ABARS, NDM (Network Data Mover), Informatica 9.1, Autosys, Quality Center.

Responsibilities:

  • Prepared low level design documents, mapping documents and Unit Test Plans.
  • Developed Single Sourcing Data mart using Star Schema Fact and Dimension tables.
  • Created logical/physical data model diagrams using Visio for the Single Sourcing Data Mart.
  • Worked on implementing the data mapping into ETL process.
  • Developed and tested Extract, Transformation, and Loading (ETL) modules using MultiLoad, TPUMP and BTEQ utilities to process the source files.
  • Implemented the complex mapping logic with optimal performance using OLAP functions.
  • Worked on Performance tuning of queries involving complex logic and processing large volume of data.
  • Worked on exporting the data from different hops into mainframe flat files with Teradata Fast Export and BTEQ.
  • Developed ETL solution using the tool sets from Teradata, Mainframe and UNIX to process huge volume of data.
  • Worked on creating ETL process for Wholesale Sourcing using Informatica PowerCenter. Created Mappings and Mapplets to transform the data according to the business rules.
  • Worked on various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression and Update Strategy. Implemented Slowly Changing Dimensions.
  • Provided knowledge transfer to more than 10 resources on Teradata to handle data warehousing projects.
  • Developed Performance Metrics Automation tool, a VB Macro tool to automate the Performance Metrics collection with the standards mandated by CAB.
  • Created Mass File Transfer tool for Mass File Transfer (NDM) between Production and test systems.
  • Created Mainframe file back-up using ABARS.
  • Created Dynamic Code generator tool, a REXX tool that converts data mapping to Mainframe MLOAD/BTEQ scripts.
  • Implemented a reusable tool to move data from one Teradata box to other using Teradata TPT utility.
  • Created Excel VB macro tool that connects to Teradata for the aggregated report data and create pre-defined format reports.
  • Created Automated Test Evidence tool, a REXX tool to save the required data from Job spool logs and write to PS files to save evidences for Unit testing.
  • Worked on documentation involved in each phase of the project. Created playbooks and operational manual for production support.
  • Created System Appreciation Document and User guide.
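
The MultiLoad-based ETL modules above follow a standard script shape: log table, layout, DML label, then the import. A sketch only; the input path, logon and table names are placeholders:

```sql
-- Hypothetical MultiLoad (MLOAD) script: bulk-load a flat file into a
-- staging table. All names and the input path are illustrative.
.LOGTABLE stg_db.txn_load_log;
.LOGON tdprod/etl_user,password;

.BEGIN IMPORT MLOAD TABLES stg_db.txn_stage;

.LAYOUT txn_layout;
.FIELD acct_id * CHAR(10);
.FIELD txn_amt * CHAR(12);

.DML LABEL ins_txn;
INSERT INTO stg_db.txn_stage (acct_id, txn_amt)
VALUES (:acct_id, :txn_amt);

.IMPORT INFILE /data/in/txn_daily.dat
    LAYOUT txn_layout
    APPLY ins_txn;

.END MLOAD;
.LOGOFF;
```

MLOAD's restart log table (.LOGTABLE) is what allows a failed load to resume from its checkpoint rather than start over.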

Confidential

Software Engineer

Language: SQL, JCL, COBOL, REXX, VSAM

Database: DB2

Tools: Changeman, CA7, File-Aid, SYNCSORT, XPED, VSS, VSAM

Responsibilities:

  • Created System Appreciation Document.
  • Created High Level Design and Detailed Design documents based on Client requirements.
  • Developed and tested the COBOL-DB2 and CICS-DB2 programs.
  • Worked on VSS to maintain the versions of components.
  • Maintained customer and their Service Order information in Master file (VSAM/PS) and DB2 tables.
  • Provided support for System integration testing and User acceptance testing.
  • Participated in Rapid Responses during production run and provided technical solutions for the business problems.
  • Worked on Customer queries and Mass Service Order processing for Private Lines.
  • Handled Implementation activities and post implementation support.
  • Automated SOFE/PL processing for the Disaster Recovery exercise.
  • Created Automated Regression Testing Tool, a REXX tool to execute base and test job stream, compare the data, create report with regression testing results.
  • Proposed and worked on converting 2-day processing of Private Line orders to one-day processing.
