
Teradata Developer Resume


SUMMARY

  • Over 10 years of IT experience in the development of Enterprise Data Warehouse applications using Informatica, Oracle, and Teradata.
  • Experience in all phases of data warehouse development: requirements, analysis, design, development, testing, and post-production support.
  • Strong, in-depth knowledge of data analysis, data quality, and source system analysis.
  • Independent, self-starting, enthusiastic team player with strong adaptability to new technologies.
  • Experience with big data technologies including Hadoop, Sqoop, Pig, and Hive.
  • Experience in writing Hive and Unix shell scripts.
  • Excellent track record of delivering quality software on time to meet business priorities.
  • Developed data warehouse/data mart systems using various RDBMSs (Oracle, MS SQL Server, Teradata, and DB2) as well as mainframe sources.
  • Highly proficient with Informatica PowerCenter and PowerExchange, with exposure to Informatica Data Services.

TECHNICAL SKILLS

ETL Tools: Informatica, DataStage, SSIS

Databases: Teradata 12/13/14, Oracle 9i/10g/11g/12c, MySQL, SQL Server 2000/2005, MS Access, DB2, Hadoop (HDFS)

GUI: .NET custom development, Business Objects, MicroStrategy

Operating Systems: Windows, Unix, Linux

Languages: C#, VBScript, HTML, DHTML, JavaScript, SQL, PL/SQL, Unix shell, Python, HiveQL, Pig Latin

Web Related: ASP.NET, VBScript, HTML, DHTML, Java, JavaScript

Tools & Utilities: Teradata Parallel Transporter, Aprimo 6.1/8.x, BTEQ, SQL Assistant, Toad, SQL Navigator, SQL*Loader, $U, HP Quality Center, PVCS, DataFlux, UC4, Control-M

Domain Knowledge: Banking, Finance, Insurance, Healthcare, Energy

PROFESSIONAL EXPERIENCE

Teradata Developer

Confidential

Responsibilities:

  • Worked with Confidential and United business users to capture data requirements and transformation rules mapping source systems to the Albertsons EDW.
  • Wrote Python scripts to gather usage statistics for all edge nodes (see the usage-stats sketch after this list).
  • Developed mapping and design documents for sales, promotional, and marketing data.
  • Performed data profiling and source system analysis to understand the data and its quality issues.
  • Developed BTEQ scripts to transform data from staging to third normal form (3NF) and then into aggregates (see the BTEQ sketch after this list).
  • Tuned complex Teradata queries to meet performance-level agreements using statistics, indexes, and partitioning techniques.
  • Used MultiLoad, FastLoad, and BTEQ; created and modified databases and performed capacity planning.
  • Generated flat files from Teradata 3NF tables using the FastExport utility, then FTPed them via shell script to a separate UNIX server for the application team's consumption.
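
A minimal sketch of the kind of edge-node usage-stats collection described above; the metrics, file paths, and CSV layout are illustrative assumptions, not the original script:

    #!/usr/bin/env python3
    # Illustrative sketch: collect basic usage stats on one edge node.
    # Paths, metrics, and the CSV layout are assumptions, not the original script.
    import csv
    import os
    import shutil
    import socket
    from datetime import datetime

    usage = shutil.disk_usage("/")          # total/used/free bytes for the root filesystem
    load1, load5, load15 = os.getloadavg()  # 1/5/15-minute CPU load averages

    row = [
        datetime.now().isoformat(timespec="seconds"),
        socket.gethostname(),
        round(usage.used / usage.total * 100, 1),  # disk used, percent
        load1, load5, load15,
    ]

    # Append one row per run (e.g. from cron); a downstream job aggregates across nodes.
    with open("/tmp/edge_node_usage.csv", "a", newline="") as f:
        csv.writer(f).writerow(row)
    print("recorded:", row)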
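
A minimal sketch of driving a stage-to-3NF BTEQ transform from Python; the TDPID, credentials, and table names (stg.stg_sales, edw.sales_3nf) are placeholders, not the original objects:

    #!/usr/bin/env python3
    # Illustrative sketch: run a stage-to-3NF BTEQ transform via the bteq client.
    # The TDPID, credentials, and table/column names below are placeholders.
    import subprocess

    BTEQ_SCRIPT = """
    .LOGON tdprod/etl_user,etl_password;

    -- Move cleansed staging rows into the 3NF target (names are illustrative).
    INSERT INTO edw.sales_3nf (sale_id, store_id, item_id, sale_dt, sale_amt)
    SELECT sale_id, store_id, item_id, sale_dt, sale_amt
    FROM   stg.stg_sales
    WHERE  sale_dt = CURRENT_DATE - 1;

    .IF ERRORCODE <> 0 THEN .QUIT 1;
    .LOGOFF;
    .QUIT 0;
    """

    # BTEQ reads its commands from stdin; a non-zero return code signals failure.
    result = subprocess.run(["bteq"], input=BTEQ_SCRIPT, text=True, capture_output=True)
    if result.returncode != 0:
        raise SystemExit(f"BTEQ transform failed:\n{result.stderr or result.stdout}")
    print("Stage-to-3NF load completed.")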

Environment: Teradata, Teradata Viewpoint, Teradata Studio, UNIX, Python

ETL Developer

Confidential, Sacramento, CA

Responsibilities:

  • Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
  • Designed, developed, and tested the various mappings, mapplets, worklets, and workflows involved in the ETL process.
  • Developed and integrated data quality measures into the ETL framework using Informatica Data Quality (IDQ).

Environment: SQL Server, PL/SQL, MySQL, Informatica PowerCenter, Windows XP, UNIX.

Teradata Performance Engineer/ETL Developer

Confidential, CA

Responsibilities:

  • Performed performance tuning, including collecting statistics, analyzing EXPLAIN plans, and determining which tables needed statistics (see the tuning sketch after this list); increased performance by 50-75% in some situations.
  • Developed complex SQL queries to identify performance bottlenecks in processing.
  • Deep understanding of banking campaign management.
  • Worked with business analysts to develop effective models and advanced SQL statements.
  • Used MultiLoad and BTEQ; created and modified databases and performed capacity planning.
  • Developed Sqoop jobs to integrate data from Oracle and Teradata for an application migration (see the Sqoop sketch after this list).
  • Transformed data from staging (STG) tables into final tables using Hive scripts.
  • Developed Pig scripts for analysts to analyze data on HDFS.
  • Wrote Python scripts to gather usage statistics for all edge nodes.
  • Used Hadoop to refine and analyze clickstream data.
  • Delivered new, complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
  • Refreshed data using the FastExport and FastLoad utilities.
  • Developed Informatica mappings for source-to-target loading from BODI to TP.
  • Worked on Aprimo integration, customization, and configuration.
  • Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
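
A minimal sketch of the statistics/EXPLAIN tuning loop described above, assuming the open-source teradatasql driver; the connection details and table/column names (edw.campaign_resp, acct_id) are placeholders:

    #!/usr/bin/env python3
    # Illustrative sketch: inspect a query plan and collect statistics if needed.
    # Assumes the `teradatasql` driver; host, credentials, and names are placeholders.
    import teradatasql

    QUERY = "SELECT acct_id, COUNT(*) FROM edw.campaign_resp GROUP BY acct_id"

    with teradatasql.connect(host="tdprod", user="etl_user", password="etl_password") as con:
        with con.cursor() as cur:
            # 1. Inspect the optimizer plan; "no confidence" steps often point
            #    at columns that are missing statistics.
            cur.execute("EXPLAIN " + QUERY)
            plan = "\n".join(row[0] for row in cur.fetchall())
            print(plan)

            # 2. Collect statistics on the suspect column (heuristic, for illustration),
            #    then the plan can be re-checked.
            if "no confidence" in plan.lower():
                cur.execute("COLLECT STATISTICS COLUMN (acct_id) ON edw.campaign_resp")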
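
A minimal sketch of launching a Sqoop import like the one described above; the JDBC URL, credentials file, and table/target names are placeholders:

    #!/usr/bin/env python3
    # Illustrative sketch: kick off a Sqoop import from Oracle into HDFS.
    # The JDBC URL, password file, table, and target directory are placeholders.
    import subprocess

    sqoop_cmd = [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//oraprod:1521/ORCL",
        "--username", "etl_user",
        "--password-file", "/user/etl/.ora_pwd",   # avoids passwords on the command line
        "--table", "CAMPAIGN_RESP",
        "--target-dir", "/data/stg/campaign_resp",
        "--num-mappers", "4",
    ]

    # A non-zero exit code means the import (and its underlying MapReduce job) failed.
    subprocess.run(sqoop_cmd, check=True)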

Environment: Teradata, Teradata Viewpoint, Aprimo, Informatica PowerCenter, Python, Pig, Hive, Oracle, PL/SQL, Windows, HP Quality Center, UNIX.

ETL and Teradata Developer

Confidential, CA

Responsibilities:

  • Performed analysis, design, development, testing, and deployment of Informatica workflows, BTEQ scripts, and Python and shell scripts.
  • Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
  • Designed, developed, and tested the various mappings, mapplets, worklets, and workflows involved in the ETL process.
  • Developed and integrated data quality measures into the ETL framework using Informatica Data Quality (IDQ).
  • Performed data profiling using IDQ as input to ETL design and data modeling.
  • Extensively used ETL to transfer data from different source systems and load it into the target database.
  • Developed Informatica mappings covering all sources, targets, and transformations using Informatica Designer.
  • Extracted data from various sources across the organization (Oracle, MySQL, SQL Server, and flat files) and loaded it into the staging area (see the staging-load sketch after this list).
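
A minimal sketch of one flat-file-to-staging load of the kind described above, assuming the teradatasql driver, a pipe-delimited extract, and an all-VARCHAR staging table stg.stg_customer (all placeholders):

    #!/usr/bin/env python3
    # Illustrative sketch: load a pipe-delimited extract into a staging table.
    # Assumes `teradatasql` and an all-VARCHAR staging table (raw extract, typed later);
    # paths, credentials, and table/column names are placeholders.
    import csv
    import teradatasql

    with open("/data/extracts/customer.dat", newline="") as f:
        rows = [r for r in csv.reader(f, delimiter="|")]

    with teradatasql.connect(host="tdprod", user="etl_user", password="etl_password") as con:
        with con.cursor() as cur:
            # Passing a list of row sequences performs a batched insert.
            cur.execute(
                "INSERT INTO stg.stg_customer (cust_id, cust_name, cust_dob) VALUES (?, ?, ?)",
                rows,
            )
    print(f"loaded {len(rows)} rows into stg.stg_customer")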

Environment: Teradata, Oracle, PL/SQL, MySQL, Informatica PowerCenter, PowerExchange, IDQ, OCL Tool, UC4, Control-M, ER Viewer, Business Intelligence, Windows, HP Quality Center, UNIX, Linux.

ETL Developer

Confidential, Annapolis, MD

Responsibilities:

  • Developed low-level mappings for tables and columns from source to target systems.
  • Wrote and optimized initial data load scripts using Informatica and database utilities.
  • Used partitions to extract data from the source and load it into Teradata via TPT load, with proper load balancing on the Teradata server (see the TPT sketch after this list).
  • Wrote complex BTEQ scripts to incorporate business functionality while transforming data from staging into third normal form.
  • Participated in a Teradata upgrade project from TD 12 to TD 13.10, conducting regression testing.
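
A minimal sketch of kicking off a TPT load like the one described above; tbuild's -f (script file) and -j (job name) flags are standard, while the script path and job name are placeholders:

    #!/usr/bin/env python3
    # Illustrative sketch: launch a TPT job with tbuild.
    # Assumes load_customer.tpt is a prewritten TPT script whose LOAD operator
    # reads the partitioned extract files; path and job name are placeholders.
    import subprocess

    # -f names the TPT script; -j names the job (used for checkpoint/restart).
    result = subprocess.run(
        ["tbuild", "-f", "/etl/tpt/load_customer.tpt", "-j", "load_customer_daily"],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        raise SystemExit(f"TPT job failed:\n{result.stdout}\n{result.stderr}")
    print("TPT load completed.")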

Environment: Teradata, Oracle, PL/SQL, MySQL, Informatica PowerCenter, SSIS, SSRS, ER Viewer, Windows, HP Quality Center, UNIX.
