
ETL Informatica/Teradata Developer Resume

Cleveland, OH

SUMMARY

  • Over 7 years of strong experience analyzing, developing, improving, and maintaining an enterprise data warehouse, providing Business Intelligence solutions using the ETL tool Informatica PowerCenter 10.x/9.x and performance tuning, along with the Teradata RDBMS and its utilities (BTEQ, FastLoad, MultiLoad, Stream, and FastExport).
  • Good knowledge of data warehouse concepts and principles - IBM Basel II, star schema, and snowflake schema, along with SCD Types 1, 2, and 3 and CDC concepts.
  • Excellent knowledge of modern development life cycles, including Agile, SDLC, and Waterfall methodologies.
  • Well acquainted with the Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, and Mapplet and Mapping Designer.
  • Worked extensively with complex mappings using different transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, and Aggregator.
  • Strong experience in developing sessions/tasks, worklets, and workflows using the Workflow Manager tools - Task Developer, Workflow Designer, and Worklet Designer.
  • Performed ETL development, performance tuning, peer reviews, unit testing, and test-case creation for end-to-end delivery, along with environment setup and planning of project deliverables.
  • Experienced in identifying bottlenecks in ETL and existing processes and applying performance tuning to production applications using PDO (source, target, and full), partitioning, indexing, aggregate tables, load strategies, commit intervals, and transformation tuning.
  • Good experience in documenting the ETL process flow for easier maintenance and analysis.
  • Hands-on with Teradata Studio, the Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport, and TPump), Teradata Data Mover (TDM), and Teradata multi-value compression (MVC).
  • Very good experience with Teradata database tuning using Teradata Data Mover, statistics collection, join indexes, primary and secondary indexes, multi-value compression (MVC) at the DDL level, and staging-to-main-table data processing, along with issue debugging and SQL tuning; a sketch of these DDL-level techniques appears after this list.
  • Along with Scrum development, 3+ years of coordination experience with offshore/onshore, Data Analyst, and Data Modeler teams for requirement gathering, issue fixes, the design process, and analysis of new and existing sources.
  • Use the VersionOne and Jira project management tools to track and maintain records of all Agile phases, from backlog through report creation, development stories, and tasks.
  • Attend daily scrum and weekly and bi-weekly development status meetings with the scrum team, Project Lead, Project Manager, Data Analysts, Data Modelers, and Testing team for status, delivery, requirements, and planning activities.
  • Provide work estimates for change requests, with deliverable timelines.
  • Work on change requests in ServiceNow with deployment tools (IBM uDeploy, Artifactory, Git, Harvest) for end-to-end delivery, including sign-off from the TCoE (Testing Team) and the ETL Architect on ETL and SQL performance improvements. Able to meet tight deadlines through quick learning, status reporting, cooperativeness, and hard work.
  • Performed ad-hoc requirements and POCs for new initiatives/projects, including a POC on migrating Informatica PowerCenter 9.6 to 10.2 with an environment upgrade, new platform setup for the application, an Oracle-to-Teradata migration project, and a new directory-structure design for the Linux ETL server. Also served as the SPOC for my project, helping and coordinating with other applications, and helped set up the IBM InfoSphere data lineage web-based tool for another application.
  • Restructured Linux/Unix scripts for better directory management in an enterprise data warehouse that frequently generates thousands of files on the server; created generic scripts and files that reduced rework across environments (DEV, SIT, UDV, and PROD).
  • Good experience in UNIX/Linux shell scripting, FTP, SFTP, and file management in various UNIX/Linux environments, including directory, environment, and connection setup for applications.
  • Good experience setting up job scheduling with the CA Workload Automation CA7 and Autosys tools.
  • Preparing for the AWS Solutions Architect cloud computing certification, along with data mining in statistics using the R and Python programming languages.
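
To illustrate the DDL-level tuning named above, here is a minimal Teradata SQL sketch; the database, table, and column names are hypothetical, and real compressed-value lists and index choices would come from data profiling:

    -- Hypothetical staging table: multi-value compression (MVC) on
    -- low-cardinality columns, and a primary index chosen for even
    -- row distribution across AMPs.
    CREATE MULTISET TABLE edw_stg.customer_txn (
        txn_id     BIGINT NOT NULL,
        txn_status CHAR(1) COMPRESS ('A','C','P','R'),  -- MVC: frequent codes stored once
        region_cd  SMALLINT COMPRESS (1,2,3,4,5),
        txn_amt    DECIMAL(18,2),
        load_dt    DATE FORMAT 'YYYY-MM-DD'
    )
    PRIMARY INDEX (txn_id);

    -- Refresh optimizer statistics after staging-to-main processing so
    -- the optimizer can choose efficient join plans.
    COLLECT STATISTICS ON edw_stg.customer_txn COLUMN (txn_id);
    COLLECT STATISTICS ON edw_stg.customer_txn COLUMN (region_cd);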

TECHNICAL SKILLS

Development Tool: Informatica PowerCenter 10.x/9.x, Informatica PowerExchange 10.x/9.x, Teradata Studio, Teradata utilities (FastLoad, MultiLoad, FastExport, TPump), Teradata Data Mover, Teradata multi-value compression (MVC), Toad for Oracle, CA Workload Automation CA7 Edition, MS Office, Autosys scheduler, HP ALM/Quality Center, Web Client, PuTTY, and WinSCP

Agile: VersionOne and Jira project management tools; Scrum methodology

Code Deployment Tool: IBM UrbanCode Deploy (uDeploy), Artifactory, and CA Harvest Software Change Manager

Database: Teradata, Teradata Data Mover, Teradata multi-value compression (MVC), Teradata TPT utilities (BTEQ, FastLoad, MultiLoad, FastExport, TPump), and basics of Oracle

Analysis Tool: Teradata Viewpoint, IGC Lineage Tool

Programming Languages: SQL, basics of PL/SQL, basics of Java

Scripting Languages: Basics of UNIX/Linux shell scripting and commands

PROFESSIONAL EXPERIENCE

ETL Informatica/Teradata Developer

Confidential, Cleveland, OH

Responsibilities:

  • Requirement gathering and analysis for new source systems, and for changes to existing ones, with Data Analysts and Data Modelers
  • Developed the required code and used the VersionOne project management tool to track and maintain records of all Agile phases, from backlog through report creation, development stories, and tasks
  • Attended daily, weekly, and bi-weekly development status meetings with the Project Lead, Project Manager, and client, plus daily and weekly status meetings with the offshore/onshore team, Data Analysts, Data Modelers, and Testing team on project status and delivery of requirements.
  • Resolved issues relating to technical and functional components, escalating matters requiring immediate attention to the immediate supervisors; estimated code design, development, and deliverables.
  • Developed ETL code with Teradata SQL and the Teradata TPT utilities (BTEQ, FastLoad, MultiLoad, Stream, and FastExport), and designed new or modified existing processes per requirements; see the BTEQ sketch after this list
  • Performed Informatica and Teradata performance tuning for optimum performance: tuned at the mapping, session, and workflow levels with the help of PDO, partitioning, indexing, and session parallelism; built reusable transformations and mapplets wherever redundancy was needed; and pushed most joins and calculations into Source Qualifier SQL to overcome Integration Service overhead
  • On the database side, performed Teradata tuning with the help of Teradata Data Mover, statistics collection and its automation, object-list building for tables, join indexes, multi-value compression (MVC) at the DDL level, and staging-to-main-table data processing; ran parallel loads under different Teradata users at the same time for load distribution and TPT session allocation, and created multiple Teradata service users to improve performance
  • Reviewed all ETLs, unit test case documents, and design, IQA, and EQA documents completed by the team against a proper review checklist; performed defect prevention and causal analysis activities.
  • Developed and enriched compliance documentation for existing and new application components.
  • Reviewed and set up ETL scheduling with CA7 automation per source and consumer requirements, deployed developed code with the help of uDeploy and Artifactory, prepared CRs (change requests) for production setup, and coordinated implementation with the application support team
  • Handled production implementation activities before production deployment, including all scheduling, deliverable components, and documents; ran production support turnover meetings and represented changes at CAB for each release.
  • Provided assistance to the Support Team for production issues/incidents.
  • Transferred knowledge of the existing application to team members and conducted meetings for the same
  • Performed proofs of concept (POCs) for ad-hoc client requirements
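
As referenced above, a minimal BTEQ sketch of a staging-to-main-table step; the logon string, database, and table names are hypothetical:

    .LOGON tdprod/etl_user,********;

    -- Move the day's staging rows into the main table, then clear
    -- staging; BT/ET makes the two statements one unit of work.
    BT;
    INSERT INTO edw.acct_main
    SELECT * FROM edw_stg.acct_stg
    WHERE  load_dt = CURRENT_DATE;

    DELETE FROM edw_stg.acct_stg
    WHERE  load_dt = CURRENT_DATE;
    ET;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .QUIT 0;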

Environment: Informatica PowerCenter 10.1, Teradata, Teradata TPT utilities, Teradata Data Mover, uDeploy, Artifactory, VersionOne, PuTTY/WinSCP, CA7 scheduling, IGC data lineage, Agile, Scrum, HP ALM for defect tracking

ETL Informatica/Teradata Developer

Confidential, Cleveland, OH

Responsibilities:

  • Attended daily, weekly, and bi-weekly status meetings with the Project Lead, Project Manager, and client for the ongoing release and activities; held team meetings with team members and assigned tasks for development activity
  • Set up all Development, SIT, and UDV environments, performed all performance validations on the new environments, reported the results, and took appropriate action to improve them
  • Set up the Linux structure, created new Linux directories for better performance, and updated all CA7 jobs and ETL code for the new server
  • Created automated scripts and ETL to change code and CA7 jobs per the new environments
  • Code review, unit test case documents, and design, IQA, and EQA documents completed by the team with a proper review checklist for the new environment
  • Along with the migration activity, performed regular release activity for continued deployment of new requirements
  • Developed ETL code with Teradata SQL and the Teradata utilities (BTEQ, FastLoad, MultiLoad, Stream, and FastExport), and designed new or modified existing processes for release changes; see the FastLoad sketch after this list
  • Performed Informatica tuning of mappings and sessions for optimum performance, along with PDO and the Teradata load utilities (FastLoad, MultiLoad, FastExport, and TPump) for data conversion; reusable transformations and mapplets were built wherever redundancy was needed.
  • Performed defect prevention and causal analysis activities; built system and business knowledge.
  • Developed and enriched compliance documentation for existing and new application components; transferred knowledge of the existing application to team members.
  • Performance tuning of long running queries of existing applications.
  • Performed proofs of concept (POCs) for ad-hoc client requirements such as the GFB migration and setup
  • Coordinated with onshore, the ETL Admin, and the ETL Architect in daily and weekly meetings.
  • Performed setup activities for the Informatica and Linux servers, creating firewall rules, Teradata and Oracle wallets, connectors, and folders per the new GFB guideline in all environments (UAT, QA, and PROD)
  • Performed end-to-end testing and validation in all environments for the new setup and connections
  • Developed new ETL code and scripts for validation and testing of the new setup, connections, and Linux directories
  • Created new configuration files and re-created all Linux directories for easy access in production by the Support Team
  • Created and set up new repositories for multiple parallel requirements
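
As referenced above, a minimal FastLoad sketch of a flat-file-to-staging load; the TDPID, credentials, file path, and table layout are hypothetical:

    SESSIONS 4;
    LOGON tdprod/etl_user,********;
    ERRLIMIT 50;

    DATABASE edw_stg;

    -- FastLoad needs an empty target table; rejected rows land in the
    -- two error tables declared below.
    BEGIN LOADING acct_stg
        ERRORFILES acct_stg_err1, acct_stg_err2;
    SET RECORD VARTEXT "|";

    DEFINE acct_id   (VARCHAR(18)),
           acct_type (VARCHAR(2)),
           open_dt   (VARCHAR(10))
    FILE = /data/inbound/acct_extract.dat;

    INSERT INTO acct_stg
    VALUES (:acct_id, :acct_type, :open_dt);

    END LOADING;
    LOGOFF;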

Environment: Informatica 9.5/9.6, Teradata, Teradata SQL and Teradata TPT utilities, Oracle and Teradata wallets, Linux server, firewall rules, GFB guideline, UAT/QA/PROD, uDeploy, HP ALM for defect tracking, automated environment test ETLs, data conversion, ETL code and scripts

ETL Informatica/Teradata Developer

Confidential, Cleveland, OH

Responsibilities:

  • Attended daily meetings with my team and the platform team on daily activity
  • Provided daily status on activities to my team lead and the client
  • Converted all metadata, ETL, and Unix scripts for the Teradata environments
  • Compared and re-created DDL structures per Teradata environments and keywords
  • Developed pseudo one-to-one ETL from Oracle to Teradata for data validation across the environments.
  • Converted Oracle-based ETL into Teradata-based code with the required performance tuning, using the Teradata TPT utilities, statistics definitions, and multi-value compression at the DDL level; a conversion sketch appears after this list
  • Developed ETL, unit tested the ETL code, fixed defects, and delivered code to the Testing Team for validation; created unit test case, IQA, and EQA documents for the customer.
  • Ran production in parallel for three months for end-to-end monitoring and performance enhancement on the Teradata side
  • Created Unix scripts to improve the load process and the changed requirements of the Oracle-to-Teradata platform migration.
  • Deployed ETL/Linux/DDL code into higher environments for IST and UDV data loads for regression testing
  • Formed a team to work on bottlenecks in code migrated from Oracle to Teradata and performed performance tuning on it, based on the source and the type of load process (relational, flat file, mainframe, etc.); used the Teradata TPT utilities (FastLoad, MultiLoad, FastExport, TPump), PDO, session partitioning, parallel execution of code, and flat files as a staging area.
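
To make the DDL conversion concrete, a hypothetical before/after for one table; the names, compressed values, and type mappings are invented for illustration:

    -- Oracle original (for reference):
    --   CREATE TABLE ord_hdr (
    --       ord_id   NUMBER(12) NOT NULL,
    --       ord_stat VARCHAR2(1),
    --       ord_dt   DATE
    --   );

    -- Teradata equivalent, with MVC added at the DDL level:
    CREATE MULTISET TABLE edw.ord_hdr (
        ord_id   DECIMAL(12,0) NOT NULL,          -- NUMBER(12)  -> DECIMAL(12,0)
        ord_stat CHAR(1) COMPRESS ('O','S','C'),  -- VARCHAR2(1) -> CHAR(1) + MVC
        ord_dt   TIMESTAMP(0)                     -- Oracle DATE carries a time part
    )
    PRIMARY INDEX (ord_id);

    -- Collect optimizer statistics after the initial load.
    COLLECT STATISTICS ON edw.ord_hdr COLUMN (ord_id);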

Environment: Informatica 9.5, Teradata, Teradata TPT, Toad for Oracle, HP ALM, Harvest, Autosys, ETL/Linux/DDL code, IST and UDV data loads, Unix scripts, defect fixing, DML for metadata
