Teradata Developer Resume

Richardson, TX

SUMMARY

  • 12 years of experience in Information Technology, with a strong background in analyzing, designing, developing, testing, and implementing data warehouse solutions in domains such as Banking and Health Care.
  • 9 years of data warehousing experience using Informatica PowerCenter Client 10.x/9.x/8.x/7.x.
  • 3+ years of experience with Talend Open Studio for Big Data (6.1, 6.4).
  • Expertise in creating big data batch jobs in Talend (Open Studio 6.1/6.4) using tMap, tJoin, tReplicate, tParallelize, tSortRow, tLogCatcher, tRowGenerator, tSetGlobalVar, tJava, tJavaRow, tAggregateRow, tWarn, tFilter, tDie, etc.
  • Involved in code-migration projects from Informatica/Ab Initio to Talend on a Hadoop cluster.
  • Strong hands-on experience with Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport, TPT); a minimal FastLoad sketch appears after this list.
  • Proficient in Teradata query and application tuning/optimization.
  • Hands-on experience with query tools such as TOAD, Oracle SQL Developer, and Teradata SQL Assistant.
  • Extensive ETL testing experience using Informatica PowerCenter Designer, Workflow Manager, and Workflow Monitor.
  • Follow agile best practices in daily scrum calls and on the Jira board; actively participate in Scrum events, including sprint planning, daily scrum, sprint reviews, backlog grooming/review, and retrospectives. Resolve and update Jira stories promptly to keep communication transparent and velocity and burndown charts meaningful.
  • Worked on Spark SQL and DataFrames for faster execution of Hive queries using Spark SQLContext.
  • Performed ETL process with Spark using Scala for processing and validation of raw data logs.
  • Performed data processing in Spark by handling multiple data repositories / data sources.
  • Created RDDs in Spark using SparkContext and used Scala APIs to read multiple data formats.
  • Experience in converting business process into RDD transformations using Apache Spark and Scala.
  • Experienced in using Apache Spark with Scala and Spark SQL.
  • Experienced in complete life-cycle implementations of data warehouses.
  • Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Star and Snowflake schemas addressing Slowly Changing Dimensions (SCDs).
  • Expert knowledge in troubleshooting and performance tuning at the source, mapping, target, and session levels.
  • Used Informatica PowerCenter 10.1/9.6/9.0.1 to extract, transform, and load data into the data warehouse from sources such as Oracle, EBCDIC, XML, CSV, and flat files.
  • Coordinated planning, testing, and implementation for the Informatica 9.1-to-9.6 and 9.6-to-10.1 migrations and for Teradata server upgrades.
  • Worked with the Teradata database to implement data cleanup and performance-tuning techniques.
  • Experience in building fact tables and dimension tables and in handling slowly changing dimensions and surrogate keys.
  • Experienced with multiple databases, including Oracle 12c and Teradata 15/14/13/12.
  • Worked on the Teradata utilities BTEQ, FastLoad, FastExport, MultiLoad, and TPump.
  • Expertise in designing and Developing complex Mappings using Informatica PowerCenter Transformations - Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Sequence Generator and Slowly Changing Dimensions.
  • Ensured smooth coordination: managed team meetings, the transition from the customer to the current support team, and communication among the onsite and offshore teams.
  • Coordinated connectivity and access-related activities (queues, servers, tools) and network issues during the initial engagement setup.
  • Working experience using Informatica Workflow Manager to create sessions, batches, and scheduled workflows and worklets; developing complex mapplets, worklets, reusable tasks, and reusable mappings; defining workflows and tasks; monitoring sessions; exporting and importing mappings and workflows; backup and recovery; and PowerExchange.
  • Expertise in unit testing, integration testing, system testing, and data validation for developed Informatica mappings.
  • Extensively used Control-M (versions 6/7/8) to schedule UNIX shell scripts and Informatica workflows.
  • Coordinated onsite and offshore teams, managing calls related to requirements, development, testing, and migration.
  • Proficient in SQL backup/recovery and disaster recovery planning.
  • Experience with various data sources, including Oracle, Teradata, DB2, Excel sheets, flat files, and XML files.
  • Well experienced in error handling and troubleshooting using various log files.
  • Good exposure to development, testing, debugging, implementation, documentation and production support.
  • Experience in handling initial and incremental loads in target database using mapping variables.
  • Strong understanding of data warehouse concepts: ETL, star and snowflake schemas, and data modeling using normalization, business process analysis and reengineering, dimensional modeling, fact and dimension tables, and physical and logical data models.
  • Developed effective working relationships with the client team to understand support requirements, developed tactical and strategic plans to implement technology solutions, and effectively managed client expectations.
  • Excellent written and verbal communication skills; strong analytical skills in evaluating business and technical processes and issues, and in developing and implementing system enhancements to facilitate overall operations.
  • Supported pre- and post-production implementation activities.
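
As a concrete illustration of the Teradata utility scripting cited above, here is a minimal FastLoad sketch. The logon string, file layout, and table names are hypothetical placeholders, not details from any actual engagement.

```sh
#!/bin/ksh
# Hypothetical sketch: bulk-load a pipe-delimited flat file into an empty
# Teradata staging table with FastLoad. All names are placeholders.

fastload <<'EOF'
.LOGON tdprod/etl_user,etl_pass;

/* FastLoad requires an empty target; two error tables capture rejects */
BEGIN LOADING stg_db.customer_stg
    ERRORFILES stg_db.customer_err1, stg_db.customer_err2
    CHECKPOINT 100000;

SET RECORD VARTEXT "|";

DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(50)),
       open_dt   (VARCHAR(10))
FILE = /data/inbound/customer.dat;

INSERT INTO stg_db.customer_stg VALUES (:cust_id, :cust_name, :open_dt);

END LOADING;
.LOGOFF;
EOF

rc=$?
[ $rc -ne 0 ] && { echo "FastLoad failed with rc=$rc" >&2; exit $rc; }
```

Because FastLoad only loads empty tables, this pattern suits initial staging loads; incremental loads typically go through MultiLoad or TPT instead.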

TECHNICAL SKILLS

DW BI tools: Informatica PowerCenter 10.1/9.6.1/8.6/8.1.1, Talend Open Studio (6.1/6.4)

Database: Oracle 10g/9i, DB2, Sybase, Teradata

Hadoop Arch: Hadoop file system (HDFS), Spark, Scala, Hive

Languages: PL/SQL, UNIX Shell Scripting

Other tools: TOAD, Teradata SQL Assistant, Clarity, Jira, AIM, Control-M, WinSCP, PuTTY, Zena, SailPoint IdentityIQ, GitHub Enterprise 2.17, UrbanCode Deploy, Jenkins

PROFESSIONAL EXPERIENCE

Confidential - Richardson, TX

Teradata Developer

Responsibilities:

  • Design ETL jobs that apply the business rules provided by the business team, following company-defined best practices.
  • Develop jobs and scripts to extract, transform, and load data into Teradata data warehouse tables from source systems such as DB2 tables, sequential files, and delimited files.
  • Analysis, design, development, and implementation of ETL processes using Teradata utilities such as TPT, BTEQ, FastLoad, MultiLoad, TPump, and FastExport.
  • Create shell scripts for file pre-processing (duplicate-record checks, trimming record lengths, removing special characters), archiving, and FTP-ing files to remote servers, and generate parameter files so jobs can run with different parameters in different environments; a minimal sketch of this pre-process-and-load pattern appears after this list.
  • Create BTEQ scripts that run insert, update, and delete queries and procedures for data transformation and loading, with a log file created for each session.
  • Write MultiLoad, FastLoad, and FastExport scripts to extract and load data between flat files and Teradata tables.
  • Extensively used the Teradata Parallel Transporter (TPT) utility to load data into tables and extract data per the business logic.
  • Provide production support during the client warranty period, including troubleshooting, performance improvement, and resolution of production issues.
  • Ensure unit test cases and system test plans are executed before code is migrated to higher environments.
  • Coordinate with the end users/business team on user acceptance testing and approvals for every enhancement in a release.
  • Provide code cutover and help with data availability for system and UAT testing by the QA team.
  • Schedule jobs with the Zena scheduler, building processes that run jobs in sequence using event, file, and calendar triggers.
  • Contribute to fine tuning, troubleshooting, bug fixing, defect analysis, and enhancement requirements in the code.
  • Create BTEQ scripts to load data from the landing zone into work tables per the mapping documentation.
  • Prepare mapping documents, specifying data lineage for the developers.
  • Performance-tune complex SQL queries and long-running jobs.
  • Create user stories and tasks in Jira; participate in refinement, retrospective, and sprint planning meetings for the two-week sprints.
  • Populate data into semantic layers after applying multiple business transformations.
  • Write TPT scripts to load data from various source systems.
  • Conducted code review sessions to validate development standards.
  • Gather business requirements from the business group and convert them into technical specifications.
  • Unit testing in the UNIX environment and performance tuning based on EXPLAIN plans.
  • Write BTEQ scripts for both initial and incremental loads of all static and dynamic data.
  • Create FastExport scripts to produce reports for business users to access.
  • Upload files to SFTP servers for third-party vendors, with encryption and decryption.
  • Performance optimization of costly SQL.
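
The file pre-processing and BTEQ bullets above can be sketched as one shell job; the paths, logon, and table names below are hypothetical placeholders.

```sh
#!/bin/ksh
# Hypothetical sketch of the pre-process-then-load pattern described above:
# dedupe and clean an inbound file, then run a logged BTEQ transform step.

SRC=/data/inbound/txn.dat
CLEAN=/data/work/txn_clean.dat
LOG=/data/logs/txn_load_$(date +%Y%m%d_%H%M%S).log

sort -u "$SRC" |                                  # drop exact duplicate records
  tr -d '\000-\010\013\014\016-\037' > "$CLEAN"   # strip control characters

# (a separate FastLoad/TPT step, not shown, lands $CLEAN in stg_db.txn_stg)

bteq <<'EOF' > "$LOG" 2>&1
.LOGON tdprod/etl_user,etl_pass;

/* refresh the work table used by downstream transformations,
   keeping only the latest row per transaction id */
DELETE FROM work_db.txn_wk;

INSERT INTO work_db.txn_wk (txn_id, acct_id, txn_amt, txn_dt)
SELECT txn_id, acct_id, txn_amt, txn_dt
FROM   stg_db.txn_stg
QUALIFY ROW_NUMBER() OVER (PARTITION BY txn_id ORDER BY txn_dt DESC) = 1;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

rc=$?
[ $rc -ne 0 ] && { echo "BTEQ step failed, see $LOG" >&2; exit $rc; }
```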

Environment: Teradata SQL, Teradata SQL Assistant, PuTTY, Zena, GitHub, Jenkins, UC4

Confidential - Denver, CO

ETL Informatica Developer

Responsibilities:

  • Confidential is at the forefront of innovation, addressing the full breadth of clients’ opportunities in the evolving world of cloud, digital, and platforms.
  • This project integrates Invesco and Oppenheimer Salesforce Cloud data into the existing integration-model data.
  • The sales data of both firms is integrated for digital marketing analysis.
  • The analysis is based on Contacts, Accounts, Tasks, Events, and Activity data.
  • Extensively worked with the data modelers to implement logical and physical data modeling for an enterprise-level data warehouse.
  • Extensively used Informatica PowerCenter 10.1 to extract data from various sources and load it into the staging database.
  • Wrote PL/SQL procedures for data loads into the Oracle database.
  • Extensively worked on complex and simple data transformation logic in the Staging, Integration, and Semantic layers.
  • Extensively worked with Informatica tools - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, PowerExchange, Repository Server, and Informatica Server - to load data from flat files and legacy data.
  • Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control, and Stored Procedure.
  • Handled critical issues such as data masking of sensitive information
  • Designed the mappings between sources (external files and databases) and operational staging targets; involved in data cleansing, mapping transformations, and loading activities.
  • Developed Informatica mappings and Mapplets and tuned them for Optimum performance, Dependencies and Batch Design.
  • Scheduled full, differential, and transaction-log backups of the database.
  • Proficient in writing SQL queries and creating PL/SQL stored procedures.
  • Involved in creating and modifying UNIX shell scripts and scheduling them through Autosys; a wrapper sketch for this pattern appears after this list.
  • Experience with Extraction, Transformation and Load (ETL) tools such as Informatica PowerCenter.
  • Used Informatica debugging techniques to debug mappings, using session log files and bad files to trace errors that occurred during loading; designed mapping templates to specify the high-level approach.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules for loading data.
  • Extensively worked on unit testing of the implemented transformations and mappings.
  • Created test cases and detailed documentation for unit, system, integration, and UAT testing to check data quality.
  • Extensively worked with passive transformations such as Expression, Lookup, and Sequence Generator; experience working with the access management, sales, and order management/fulfillment systems of a publishing company.
  • Documented all old and migrated stored procedures and scripts for future reference.
  • Coordinated between Development, QA and production migration teams.
  • Coordinate Onsite and offshore team, managing various calls related to requirements,development, testing and migration and leading the team.
  • Reports were generated using Business Objects for analysis.
  • Outstanding knowledge of leading application-server configuration services and capabilities.
  • Experience as an Agile team member and as backup for the scrum master.
  • Hands-on experience using Tortoise SVN, CA Service Manager, and Jira.
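
The Autosys-scheduled script pattern referenced above can be sketched with Informatica's pmcmd CLI. The domain, service, folder, workflow, and credential names below are assumed placeholders (INFA_USER/INFA_PASS would come from the job's environment).

```sh
#!/bin/ksh
# Hypothetical Autosys-callable wrapper around Informatica's pmcmd CLI:
# start a workflow, wait for completion, and surface the exit status.

DOMAIN=Domain_ETL
SERVICE=IS_ETL_PROD
FOLDER=SALES_INTEGRATION
WORKFLOW=wf_load_sf_contacts

pmcmd startworkflow \
  -sv "$SERVICE" -d "$DOMAIN" \
  -u "$INFA_USER" -p "$INFA_PASS" \
  -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

if [ $rc -ne 0 ]; then
    echo "Workflow $WORKFLOW failed (pmcmd rc=$rc)" >&2
    exit $rc
fi
echo "Workflow $WORKFLOW completed successfully"
```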

Environment: Informatica PowerCenter 10.1, Oracle 12c, Autosys, SVN, Salesforce Cloud, Oracle SQL Developer, Kanban board

Confidential

Application Support Analyst

Responsibilities:

  • Extensively worked with the data modelers to implement logical and physical data modeling for an enterprise-level data warehouse.
  • Extensively used Informatica PowerCenter 10.1/9.6/9.5.1 to extract data from various sources and load it into the staging database.
  • Designed and Developed UNIX Shell Scripts for Data manipulations and Data Conversions.
  • Worked with pre and post sessions, and extracted data from Transaction System into Staging Area.
  • Extensively worked on complex and simple data transformation logics in Staging, Integration and Semantic layers.
  • Extensively worked with Informatica Tools - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Workflow Manager, Workflow Monitor, Power Exchange, Repository server and Informatica server to load data from flat files, legacy data.
  • Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
  • Handled critical issues such as data masking of sensitive information
  • Designed the mappings between sources (external files and databases) to operational staging targets.
  • Involved in data cleansing, mapping transformations and loading activities.
  • Developed Informatica mappings and Mapplets and also tuned them for Optimum performance, Dependencies and Batch Design.
  • Scheduled full, differential, and transaction-log backups of the database.
  • Proficient in writing SQL queries and creating PL/SQL stored procedures.
  • Involved in creating and modifying UNIX Korn shell scripts and scheduling the UNIX scripts through Control-M.
  • Experience with extraction, transformation, and load (ETL) tools such as Informatica PowerCenter, Ab Initio GDE, and Talend Studio.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
  • Experience integrating various data sources (XML, EBCDIC, SQL Server, Oracle, Teradata, flat files, and DB2 mainframes) into the staging area.
  • Implemented Slowly Changing Dimension methodology for historical data; a Type 2 sketch appears after this list.
  • Designing mapping templates to specify high-level approach.
  • Extensive hands on XML import/export, deployment groups, query generation, migration using Informatica repository manager.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
  • Extensively worked on unit testing of the implemented transformations and mappings.
  • Created test cases and detailed documentation for unit, system, integration, and UAT testing to check data quality.
  • Extensively worked with passive transformations such as Expression, Lookup, and Sequence Generator; experience working with the access management, sales, and order management/fulfillment systems of a publishing company.
  • Documented all old and migrated stored procedures and scripts for future reference.
  • Coordinated between Development, QA and production migration teams.
  • Coordinate Onsite and offshore team, managing various calls related to requirements, development, testing and migration and leading the team.
  • Outstanding knowledge of leading application-server configuration services and capabilities.
  • Experience as an Agile team member and as backup for the scrum master.
  • Hands-on experience using Tortoise SVN, AIM, Clarity Tool, and Jira.
  • Reports were generated using Business Objects, Tableau, and Cognos for analysis.
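
The Type 2 slowly-changing-dimension handling noted above can be sketched as Teradata SQL run through BTEQ; the database, table, and column names are assumed for illustration only.

```sh
#!/bin/ksh
# Hypothetical sketch of the Type 2 SCD close-and-insert pattern:
# expire the old version of a changed row, then insert the new version.

bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;

/* Step 1: end-date current rows whose tracked attributes changed */
UPDATE dim
FROM dw_db.customer_dim dim, stg_db.customer_stg stg
SET eff_end_dt   = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE dim.cust_id = stg.cust_id
  AND dim.current_flag = 'Y'
  AND (dim.cust_name <> stg.cust_name OR dim.segment <> stg.segment);

/* Step 2: insert a fresh current row for customers with no open version
   (both brand-new customers and those expired in step 1) */
INSERT INTO dw_db.customer_dim
    (cust_id, cust_name, segment, eff_start_dt, eff_end_dt, current_flag)
SELECT stg.cust_id, stg.cust_name, stg.segment,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_db.customer_stg stg
LEFT JOIN dw_db.customer_dim dim
    ON dim.cust_id = stg.cust_id AND dim.current_flag = 'Y'
WHERE dim.cust_id IS NULL;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
EOF
```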

Environment: Informatica PowerCenter 10.1/9.6/9.1, Ab Initio GDE 3.2.7, Hadoop file system, Teradata 15/14, Erwin, SQL Assistant, Control-M

Confidential

Application Support Analyst

Responsibilities:

  • Worked on Talend components for transformations, file processing, Java, Big Data batch jobs, Spark SQL, and logging.
  • Extensively used Talend Big Data components such as tHiveCreateTable, tHiveRow, tHDFSInput, tHDFSOutput, tHiveLoad, tHDFSCopy, tHDFSDelete, tHDFSList, tHDFSConnection, tHiveConnection, tHiveClose, tHDFSGet, tHDFSPut, tHDFSProperties, tHDFSCompare, tHDFSRename, tHBaseInput, tHBaseOutput, tSqoopExport, and tSqoopImport; the HDFS CLI equivalents of the tHDFS* operations are sketched after this list.
  • Used various Talend components such as tFilterRow, tMap, tJoin, tPrejob, tPostjob, tFileList, tSplitRow, tAddCRCRow, tJava, tAggregateRow, tDie, tWarn, and tLogRow.
  • Leveraged the code pull/merge capabilities of Talend Studio, configuring Bitbucket Git repositories through Talend Administration Center.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Configured the Hive tables to load the profitability system in the Talend ETL repository and created the Hadoop connection for the HDFS cluster in the Talend ETL repository.
  • Work as a fully contributing team member under broad guidance, with independent planning and execution responsibilities.
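
For reference, the tHDFS* component operations listed above map to HDFS CLI calls roughly as follows; the cluster paths are placeholders.

```sh
#!/bin/sh
# Hypothetical sketch: HDFS CLI equivalents of the tHDFS* Talend components
# (put/list/get/delete). Paths are illustrative only.

hdfs dfs -mkdir -p /data/landing/sales                           # prepare landing dir
hdfs dfs -put -f /data/outbound/sales.dat /data/landing/sales/   # tHDFSPut
hdfs dfs -ls /data/landing/sales                                 # tHDFSList
hdfs dfs -get /data/landing/sales/sales.dat /data/work/          # tHDFSGet
hdfs dfs -rm -skipTrash /data/landing/sales/sales.dat            # tHDFSDelete
```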

Environment: Informatica PowerCenter 10.1/9.6/9.1, Ab Initio GDE 3.2.7, Hadoop file system, Teradata 15/14, SQL Assistant, Talend Open Studio 6.1/6.4, Hive

Confidential

Application Support Analyst

Responsibilities:

  • Extensively worked with the data modelers to implement logical and physical data modeling for an enterprise-level data warehouse.
  • Extensively used Informatica PowerCenter 10.1/9.6/9.5.1 to extract data from various sources and load it into the staging database.
  • Designed and Developed UNIX Shell Scripts for Data manipulations and Data Conversions.
  • Worked with pre and post sessions, and extracted data from Transaction System into Staging Area.
  • Extensively worked on complex and simple data transformation logics in Staging, Integration and Semantic layers.
  • Extensively worked with Informatica Tools - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Workflow Manager, Workflow Monitor, Power Exchange, Repository server and Informatica server to load data from flat files, legacy data.
  • Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
  • Handled critical issues such as data masking of sensitive information
  • Designed the mappings between sources (external files and databases) to operational staging targets.
  • Involved in data cleansing, mapping transformations and loading activities.
  • Developed Informatica mappings and Mapplets and also tuned them for Optimum performance, Dependencies and Batch Design.
  • Scheduled full, differential, and transaction-log backups of the database.
  • Proficient in writing SQL queries and creating PL/SQL stored procedures.
  • Involved in creating and modifying UNIX Korn shell scripts and scheduling the UNIX scripts through Control-M.
  • Experience with extraction, transformation, and load (ETL) tools such as Informatica PowerCenter, Ab Initio GDE, and Talend Studio.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
  • Experience integrating various data sources (XML, EBCDIC, SQL Server, Oracle, Teradata, flat files, and DB2 mainframes) into the staging area; a mainframe-file conversion sketch appears after this list.
  • Implemented Slowly Changing Dimension methodology for historical data.
  • Designing mapping templates to specify high-level approach.
  • Extensive hands on XML import/export, deployment groups, query generation, migration using Informatica repository manager.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
  • Extensively worked on unit testing of the implemented transformations and mappings.
  • Created test cases and detailed documentation for unit, system, integration, and UAT testing to check data quality.
  • Extensively worked with passive transformations such as Expression, Lookup, and Sequence Generator; experience working with the access management, sales, and order management/fulfillment systems of a publishing company.
  • Documented all old and migrated stored procedures and scripts for future reference.
  • Coordinated between Development, QA and production migration teams.
  • Coordinate Onsite and offshore team, managing various calls related to requirements, development, testing and migration and leading the team.
  • Outstanding knowledge of leading application-server configuration services and capabilities.
  • Experience as an Agile team member and as backup for the scrum master.
  • Hands-on experience using Tortoise SVN, AIM, Clarity Tool, and Jira.
  • Reports were generated using Business Objects, Tableau, Cognos, and SAS for analysis.
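
The EBCDIC/mainframe source handling noted above can be sketched with standard UNIX tools; the 120-byte record length, column positions, and paths below are assumed for illustration.

```sh
#!/bin/ksh
# Hypothetical sketch: convert an EBCDIC fixed-width mainframe extract to
# ASCII, then cut the fixed-width layout into pipe-delimited staging fields.

IN=/data/inbound/acct_extract.ebc
OUT=/data/work/acct_extract.dat

# dd translates EBCDIC to ASCII; cbs must match the mainframe record length
dd if="$IN" of="$OUT.ascii" conv=ascii cbs=120

# carve the layout: cols 1-10 account id, 11-40 name, 41-50 open date
awk '{ printf "%s|%s|%s\n",
       substr($0,1,10), substr($0,11,30), substr($0,41,10) }' \
    "$OUT.ascii" > "$OUT"
```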

Environment: Informatica PowerCenter 10.1/9.6/9.1, Ab Initio GDE 3.2.7, Hadoop file system, Teradata 15/14, Erwin, SQL Assistant, Control-M

Confidential

ETL Developer/Lead

Responsibilities:

  • Extensively worked with the data modelers to implement logical and physical data modeling for an enterprise-level data warehouse.
  • Extensively used Informatica PowerCenter 10.1/9.6/9.5.1 to extract data from various sources and load it into the staging database.
  • Designed and Developed UNIX Shell Scripts for Data manipulations and Data Conversions.
  • Worked with pre and post sessions, and extracted data from Transaction System into Staging Area.
  • Extensively worked on complex and simple data transformation logics in Staging, Integration and Semantic layers.
  • Extensively worked with Informatica Tools - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Workflow Manager, Workflow Monitor, Power Exchange, Repository server and Informatica server to load data from flat files, legacy data.
  • Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
  • Handled critical issues such as data masking of sensitive information
  • Designed the mappings between sources (external files and databases) to operational staging targets.
  • Involved in data cleansing, mapping transformations and loading activities.
  • Developed Informatica mappings and Mapplets and also tuned them for Optimum performance, Dependencies and Batch Design.
  • Scheduled full, differential, and transaction-log backups of the database.
  • Proficient in writing SQL queries and creating PL/SQL stored procedures.
  • Involved in creating and modifying UNIX Korn shell scripts and scheduling them through Control-M; a job-wrapper skeleton appears after this list.
  • Experience with extraction, transformation, and load (ETL) tools such as Informatica PowerCenter, Ab Initio GDE, and Talend Studio.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
  • Experience integrating various data sources (XML, EBCDIC, SQL Server, Oracle, Teradata, flat files, and DB2 mainframes) into the staging area.
  • Implemented Slowly Changing Dimension methodology for historical data.
  • Designing mapping templates to specify high-level approach.
  • Extensive hands on XML import/export, deployment groups, query generation, migration using Informatica repository manager.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
  • Extensively worked on unit testing of the implemented transformations and mappings.
  • Created test cases and detailed documentation for unit, system, integration, and UAT testing to check data quality.
  • Extensively worked with passive transformations such as Expression, Lookup, and Sequence Generator; experience working with the access management, sales, and order management/fulfillment systems of a publishing company.
  • Reports were generated using Business Objects, Tableau, Cognos, and SAS for analysis.
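
The Control-M-scheduled Korn shell jobs noted above commonly follow a wrapper skeleton like the one below; the paths and the load command are placeholders.

```sh
#!/bin/ksh
# Hypothetical skeleton of a Control-M-scheduled Korn shell job:
# single-instance lock, dated log, explicit return code.

JOB=daily_dim_load
LOCK=/tmp/${JOB}.lock
LOG=/data/logs/${JOB}_$(date +%Y%m%d).log

if [ -f "$LOCK" ]; then
    echo "$JOB is already running, exiting" >&2
    exit 1                          # nonzero rc shows as NOTOK in Control-M
fi
touch "$LOCK"
trap 'rm -f "$LOCK"' EXIT           # always release the lock on exit

{
    echo "$JOB started $(date)"
    /apps/etl/bin/run_dim_load.sh   # placeholder for the real load step
    rc=$?
    echo "$JOB finished $(date) rc=$rc"
    exit $rc                        # propagate status back to Control-M
} >> "$LOG" 2>&1
```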

Environment: Informatica PowerCenter 10.1/9.6/9.1, Ab Initio GDE 3.2.7, Hadoop file system, Teradata 15/14, Erwin, SQL Assistant, Control-M
