
Informatica/Teradata Developer Resume


Atlanta, GA

PROFESSIONAL SUMMARY:

  • 7+ years of experience in Information Technology, with a focus on Data Warehouse/Data Mart development and on developing strategies for extraction, transformation, and loading (ETL) in Informatica PowerCenter, Informatica Cloud, and Informatica PowerExchange from various database sources.
  • Expertise in Teradata database design, implementation, and maintenance, mainly in large-scale Data Warehouse environments.
  • Experience working with Teradata Parallel Transporter (TPT), FastLoad, MultiLoad, BTEQ, FastExport, SQL Assistant, and DDL/DML commands.
  • Involved in Informatica upgrade projects from one version to another.
  • Expertise in OLTP/OLAP system study, analysis, and E-R modeling; developed database schemas such as Star and Snowflake schemas (Fact and Dimension tables) used in relational, dimensional, and multidimensional modeling.
  • Extensive experience in developing Stored Procedures, Functions, Triggers and Complex SQL queries.
  • Experience in error handling and debugging; implemented various performance tuning techniques on Sources, Targets, Mappings, and Workflows in Informatica ETL mappings.
  • Created Unit Test cases and Unit Test results for Informatica Objects to be tested.
  • Expertise in implementing Change Data Capture (CDC).
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Expertise in Query Analyzing, performance tuning and testing.
  • Experience in writing UNIX shell scripts to support and automate the ETL process.
  • Strong experience in the ETL audit process (ABC: Audit, Balance & Control) to ensure data quality and integrity; set up threshold-based email alerts for any data loading issues in the ETL process (a BTEQ sketch of this pattern appears after this list).
  • Experience in data extraction using Informatica Cloud with Salesforce as the source.
  • Technical expertise in ETL methodologies, Informatica Power Center, Power Mart, Client tools - Mapping Designer, Workflow Manager/Monitor and Server tools - Informatica Server Manager, Repository Server Manager, and Power Exchange.
  • Expertise in testing applications and creating test cases for Unit Testing and Integration Testing, ensuring the data extracted from the source is loaded into the target properly and in the right format.
  • Used the Data Masking transformation to change sensitive production data into realistic test data for non-production environments.
  • Highly motivated and goal-oriented.
  • Implemented DWH Projects using Agile & Waterfall Methodologies.
  • Strong Understanding of Scrum Process.
  • Excellent verbal and written communication skills, a clear understanding of business procedures, and the ability to work independently as well as part of a team.
  • Excellent Analytical and Problem-Solving Skills.
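
For illustration, here is a minimal BTEQ sketch of the ABC audit pattern described above, assuming hypothetical etl_audit and stg_orders tables and a hypothetical 10,000-row threshold; the email alert itself would be sent by a wrapper shell script keyed off the BTEQ exit code.

    .LOGON tdprod/etl_user,etl_password;

    /* Record the row count for this load in a hypothetical audit table */
    INSERT INTO etl_audit (load_name, load_ts, row_cnt)
    SELECT 'stg_orders', CURRENT_TIMESTAMP, COUNT(*)
    FROM stg_orders;

    /* Balance check: returns one row only when the load meets the threshold */
    SELECT 1
    FROM stg_orders
    HAVING COUNT(*) >= 10000;

    /* Exit nonzero so the scheduler/wrapper script can raise the email alert */
    .IF ACTIVITYCOUNT = 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;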

TECHNICAL SKILLS:

Databases: Teradata 12.0/13.x/14.10, SQL Server, MySQL, and Oracle

Informatica Tools: Informatica PowerCenter 10.0.1/9.6.1/9.5.1/9.1.1/8.6.1/8.1.1/7.1, Informatica Cloud, PowerExchange, IDQ, GitHub

Programming Skills: SQL, PL/SQL, Teradata SQL, Oracle SQL

Operating Systems: Windows, UNIX, LINUX

Teradata Utilities: BTEQ, SQL Assistant, Database Window, FastLoad, MultiLoad, FastExport, TPT

Teradata DBA Utilities: Viewpoint, Query Monitor

Scheduling tool: Control-M Scheduler

Scripting Languages: UNIX Shell Scripting, BTEQ

PROFESSIONAL EXPERIENCE:

Confidential, Atlanta, GA

Informatica/Teradata Developer

Responsibilities:

  • Used Informatica as the ETL tool to pull data from source systems/files, then cleanse, transform, and load the data into Teradata using Teradata utilities.
  • Involved in requirements analysis in support of data warehousing efforts, along with Business Analysts, and worked with the QA team in a Waterfall methodology.
  • Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
  • Worked with various Informatica Power Center objects like Mappings, transformations, Mapplet, Workflows and Session Tasks.
  • Extensively worked with various Passive transformations like Expression, Lookup, Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Extensively worked in the performance tuning of Teradata SQL, ETL and other processes to optimize session performance.
  • Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
  • Wrote Python scripts for scheduling tasks.
  • Wrote Teradata macros and used various Teradata analytic functions.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings (see the SCD Type 2 sketch after this list).
  • Successfully upgraded Informatica 9.6.1 to 10.1 and was responsible for validating objects in the new version of Informatica.
  • Knowledge of data integration with Salesforce CRM using Informatica Cloud.
  • Extracted the raw data from Salesforce CRM to staging tables using Informatica Cloud.
  • Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
  • Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
  • Worked extensively with different Caches such as Index cache, Data cache and Lookup cache (Static, Dynamic and Persistence) while developing the Mappings.
  • Experience in using SVN as version control for migration.
  • Managed enhancements to Informatica objects and coordinated them with every release.
  • Provided support for the production department in handling the data warehouse.
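
For illustration, a minimal Teradata SQL sketch of the SCD Type 2 logic referenced in the mapping work above, assuming hypothetical dim_customer and stg_customer tables; in PowerCenter this logic is typically split across Lookup, Expression, and Update Strategy transformations.

    /* Expire the current row for customers whose tracked attribute changed */
    UPDATE dim_customer
    SET eff_end_dt = CURRENT_DATE - 1,
        curr_flag  = 'N'
    WHERE curr_flag = 'Y'
      AND EXISTS (SELECT 1
                  FROM stg_customer s
                  WHERE s.cust_id = dim_customer.cust_id
                    AND s.cust_addr <> dim_customer.cust_addr);

    /* Insert a new current row for new customers and for the changed
       customers whose prior version was just expired */
    INSERT INTO dim_customer
        (cust_id, cust_addr, eff_start_dt, eff_end_dt, curr_flag)
    SELECT s.cust_id, s.cust_addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.cust_id = s.cust_id
     AND d.curr_flag = 'Y'
    WHERE d.cust_id IS NULL;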

Environment: Informatica PowerCenter 10.1/9.6.1/9.5.1, Informatica Cloud, Salesforce, Oracle 11g, DB2, Teradata 15/14, Flat Files, SQL Assistant, TOAD, UNIX, Windows, SVN.

Confidential, Honolulu, HI

Informatica/ Teradata Developer

Responsibilities:

  • Involved in working with Business Analysts and Data Modelers.
  • Coded, tested, implemented and maintained medium to highly complex ETL mappings using Informatica.
  • Coded Informatica jobs for long-term reliability and accuracy.
  • Worked with the existing application support team to identify root causes of issues during workflow runs; after identifying the issues, took corrective actions and implemented long-term prevention plans.
  • Prepared and maintained documentation on all aspects of ETL processes to support knowledge transfer to other team members.
  • Worked extensively on different types of transformations: Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Data Masking, Union, Stored Procedure, and Sequence Generator.
  • Used Data Masking transformation to secure the confidential claims data.
  • Very good knowledge of and hands-on experience with flat file processing, including developing data validation mappings during flat file loads.
  • Identified and implemented parallel processing capabilities in workflow sessions and used them to reduce workflow load times.
  • Hands-on experience with performance tuning using session partitioning, lookup properties, and session properties.
  • Converted HiveQL, Sqoop, and MapReduce processes in Hadoop to Informatica.
  • Extensive experience working with Informatica Big Data Edition to extract data from Hadoop systems.
  • Developed complex mappings using Informatica PowerCenter Designer to transform data from various source systems like Oracle 10g, flat files, SQL Server, and DB2, and to load the data into Teradata.
  • Wrote shell scripts based on project requirements.
  • Used the Teradata utilities BTEQ, MultiLoad, FastLoad, and TPT for faster loads into stage and base tables (see the FastLoad sketch after this list).
  • Expertise in writing MultiLoad and FastLoad scripts.
  • Involved in tuning Teradata BTEQs for recurring production failures.
  • Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions and Validations based on the design specifications.
  • Troubleshot and performance-tuned complex ETL mappings to reduce session run time.
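
For illustration, a minimal FastLoad sketch of the stage loads referenced above, assuming a hypothetical pipe-delimited file and stg_claims table; note that FastLoad requires an empty target table and its own error tables.

    SESSIONS 4;
    LOGON tdprod/etl_user,etl_password;

    DATABASE stg_db;

    /* Pipe-delimited input; VARTEXT fields must be defined as VARCHAR */
    SET RECORD VARTEXT "|";

    DEFINE claim_id  (VARCHAR(18)),
           member_id (VARCHAR(18)),
           claim_amt (VARCHAR(18))
    FILE = /data/inbound/claims.txt;

    /* Empty target table plus two error tables, as FastLoad requires */
    BEGIN LOADING stg_claims
        ERRORFILES stg_claims_err1, stg_claims_err2;

    INSERT INTO stg_claims (claim_id, member_id, claim_amt)
    VALUES (:claim_id, :member_id, :claim_amt);

    END LOADING;
    LOGOFF;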

Environment: Informatica PowerCenter 9.6.1/9.5.1, Oracle 11g, DB2, XML, Flat Files, Teradata 14/12, Hadoop, HiveQL, Maestro, UNIX, Windows, TOAD.

Confidential, Milwaukee, WI

Informatica/ Teradata Developer

Responsibilities:

  • Gathered requirements and created functional and technical design documents covering the study, business rules, data mappings, and workflows.
  • Responsible for creating Technical Design documents, Source to Target mapping documents and Test Case documents to reflect the ETL process.
  • Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
  • Created databases, users, tables, triggers, macros, views, stored procedures, functions, packages, join indexes, and hash indexes in the Teradata database.
  • Extracted data from various source systems like Oracle, SQL server and flat files as per the requirements.
  • Extensively used Teradata analysis tools such as Teradata Visual Explain, Teradata Index Wizard, and Teradata Statistics Wizard.
  • Wrote scripts for data cleansing, data validation, and data transformation for data coming from different source systems.
  • Implemented data parallelism using the Multi-File System and partition/de-partition components, and performed repartitioning to improve overall performance.
  • Developed the Teradata Macros, Stored Procedures to load data into Incremental/Staging tables and then move data from staging into Base tables.
  • Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts using the Teradata Visual Explain utility.
  • Tuned Teradata SQL statements using EXPLAIN: analyzed data distribution among AMPs and index usage, collected statistics, defined indexes, revised correlated subqueries, used hash functions, etc. (see the tuning sketch after this list).
  • Loaded flat files into empty tables using FastLoad, then used those tables in queries for joins.
  • Scheduled automated daily, weekly, and monthly jobs using UNIX shell scripts under Control-M.
  • Worked on different tasks in Workflows like sessions, Event raise, Event wait, Decision, E-mail, Command, Assignment, Timer, Worklets and scheduling of the workflow.
  • Designing and developing mapping, transformation logic and processes in Informatica for implementing business rules and standardization of source data.
  • Worked on UNIX shell scripting for automating Informatica workflows.
  • Developed Informatica mappings, Reusable transformations. Developed and wrote procedures for getting the data from the Source systems to the Staging and to Data Warehouse system.
  • Performance-tuned Informatica mappings by reviewing Explain plans, cutting query costs with Oracle hints, and changing mapping designs.
  • Responsible to tune ETL procedures and STAR schemas to optimize load and query Performance.
  • Maintained and deployed source code using GitHub version control.
  • Well versed with all stages ofSoftware Development Life Cycle (SDLC).
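
For illustration, a short Teradata SQL sketch of the EXPLAIN-and-statistics tuning routine referenced above; the orders and customer tables and their columns are hypothetical.

    /* Inspect the optimizer plan, looking for costly row redistribution */
    EXPLAIN
    SELECT o.order_id, c.cust_nm
    FROM orders o
    JOIN customer c
      ON c.cust_id = o.cust_id;

    /* Refresh demographics on the join column for both tables */
    COLLECT STATISTICS ON orders COLUMN (cust_id);
    COLLECT STATISTICS ON customer COLUMN (cust_id);

    /* Check for skew: how evenly rows are spread across the AMPs */
    SELECT HASHAMP(HASHBUCKET(HASHROW(cust_id))) AS amp_no,
           COUNT(*) AS row_cnt
    FROM orders
    GROUP BY 1
    ORDER BY 2 DESC;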

Environment: Teradata 1 .x/12.x, Informatica PowerCenter 9.1/9.5, SAS DI Studio 9.4, Oracle 10g/11g, UNIX, SQL Server 2012/2008, GitHub, Windows, Control-M.

Confidential

Informatica/ Teradata Developer

Responsibilities:

  • Developed Informatica mappings for getting the data from the Source systems to the Staging and to Enterprise Data Warehouse system.
  • Extensively used transformations to implement the business logic such as Sequence Generator, Normalizer, Expression, Filter, Router, Rank, Aggregator, Look Up (Target as well as Source), Update Strategy, Source Qualifier and Joiner.
  • Designed complex mappings involving target load order and constraint-based loading.
  • Created, ran, and scheduled workflows and worklets using the Workflow Manager.
  • Extensively worked in the performance tuning of the programs, ETL Procedures and processes.
  • Helped code shell scripts for various administration activities, such as daily backups.
  • Performance-tuned Oracle and Teradata queries using EXPLAIN plans.
  • Experienced in using Teradata utilities BTEQ, MultiLoad, FastLoad & FastExport.
  • Optimized and tuned mappings for better performance and efficiency; created and ran batches and sessions using the Workflow Manager; extensively used UNIX shell scripts for conditional execution of workflows (see the BTEQ sketch after this list).
  • Optimized the performance of Mappings, Workflows and Sessions by identifying and eliminating bottlenecks.
  • Performed unit testing at the development level, source code migration, and documentation.
  • Involved in full life cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
  • Involved in solving Trouble Tickets raised by Data Warehouse Users.
  • Experience with production on-call support.
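
For illustration, a minimal BTEQ sketch of the conditional-execution and error-handling pattern referenced above, assuming hypothetical stg_sales and base_sales tables; the calling UNIX shell script would branch on the BTEQ exit code.

    .LOGON tdprod/etl_user,etl_password;

    /* Load the staging table first */
    INSERT INTO stg_sales
    SELECT * FROM src_sales_vw;

    /* If the staging load failed, skip the base load and exit nonzero */
    .IF ERRORCODE <> 0 THEN .GOTO LOAD_FAILED;

    INSERT INTO base_sales
    SELECT * FROM stg_sales;

    .LOGOFF;
    .QUIT 0;

    .LABEL LOAD_FAILED
    .LOGOFF;
    .QUIT 12;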

Environment: Informatica 8.6.1, Oracle 10g, Teradata, Flat Files, SQL Programming, UNIX, Windows, MS Excel, SQL*Plus.
