Sr. ETL Informatica/Teradata Developer Resume

Chicago, IL

SUMMARY

  • Over 8 years of experience in the IT industry with core competency in Business Intelligence (BI) development, technical skills in Hadoop (MapReduce, Hive, Pig), Informatica, Teradata, SQL and other client tools, and strong knowledge of data warehousing.
  • Designed, developed, implemented and maintained Informatica Power Center and Informatica IDQ 9.5.1 applications for matching and merging processes.
  • Utilized Informatica IDQ 9.5.1 to complete initial data profiling and to match and remove duplicate data.
  • Expertise in Data Quality validation, Data Profiling and Data Analysis in an Enterprise Data Warehouse environment.
  • Strong ability to execute multiple Data Quality and Data Governance projects simultaneously and deliver within timelines.
  • Good understanding of entity relationships and Data Models.
  • Worked extensively with Informatica Power Center, Informatica IDQ 9.5.1, Teradata, the Hadoop framework and Unix scripts.
  • Worked extensively on implementing MapReduce programs and Hive and Pig scripts.
  • Experience in creating Mapplets and Rules and applying those rules in Profiling.
  • Worked in the Master Build team to create the SQL scripts and Kintana packages for the deployment in the weekly, monthly and quarterly releases.
  • Experience in writing Teradata BTEQ, FastLoad and MultiLoad scripts and automated Unix scripts to deploy Teradata queries in production.
  • Experience in Data Migration projects; worked extensively on designing the MDF (Metadata Driven Framework) tool.
  • Worked on designing complex scheduling jobs using the $U (Dollar Universe) tool.
  • Experience in creating packages using Kintana and source control tools like PVCS.
  • Experience in implementing Excel Services and dashboards for reports.
  • Flourish in both independent and collaborative work environments, with quick learning ability and excellent communication and presentation skills.

AREAS OF EXPERTISE

  • Data Integration, Data Quality
  • Computer Programming
  • Hadoop Framework
  • SQL/PLSQL
  • ETL (Extract Transform Load)

TECHNICAL SKILLS

Programming Languages: SQL, PL/SQL, C, C++, XML, HTML

Operating Systems: UNIX, Windows, Sun Solaris

Tools: Erwin, TOAD, SQL Navigator, SQL Developer, Teradata SQL Assistant, SQL*Loader, ER/Studio

Frameworks: Hadoop (MapReduce, Pig, Hive, HDFS)

Database: Teradata V2R6, Teradata 12.0, Oracle

BI Tools: Cognos

ETL Tools: Informatica IDQ 9.5.1, Informatica Analyst tool, Data Integration

Scripting Languages: Unix (Shell)

Scheduling Tools: Autosys, $U, Event Engines

Repository Tools: PVCS, CVS, IDN Portal

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Sr. ETL Informatica/Teradata Developer

Responsibilities:

  • Involved in analysis of the feeds from the mainframe system by reviewing the DML of each feed.
  • Profiled the data present in Oracle using the structure of the feed and created mapping documents from the DML for creating objects in the Teradata database.
  • Created profiling reports for use by the Informatica analysts.
  • Created physical data objects to load data from one physical data object to another.
  • Worked on different transformations such as Labeler, Standardizer, Parser and Address Validator and on the match and merge process.
  • Created reference data in the Analyst tool after review and approval by the data analysts.
  • Worked with different strategies for removing duplicate data using the Match, Consolidation and Key Generator transformations.
  • Performed end-to-end data quality testing and support in an enterprise data warehouse environment.
  • Maintained data quality, data consistency and data accuracy for data quality projects.
  • Created mapping documents matching the source and target layouts and circulated them so that the LDM and PDM work could start.
  • Worked on pushdown optimization and on calling Unix scripts from the ETL tool.
  • Worked with different designer tools like Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
  • Involved in creating Tasks, Sessions, Workflows and Worklets using the Workflow Manager tools: Task Developer, Worklet Designer and Workflow Designer.
  • Created the required objects such as tables, primary indexes, views (semantic views, business views, replic views) and secondary indexes.
  • Created GRANT and REVOKE scripts to issue permissions on the tables.
  • Wrote Teradata BTEQ, FastLoad and MultiLoad scripts to load data from the Unix landing zone by creating jobs (see the BTEQ sketch after this list).
  • Performed query optimization by collecting statistics and checking the confidence levels in the EXPLAIN plan.
  • Wrote automated Unix scripts and PL/SQL procedures and triggers according to the requirements.
  • Created the development and production deployment build scripts using the IDN portal, and used CVS to check in and check out the developed and unit-tested code.
  • Created MapReduce programs to clean data and apply transformations on the target side in HDFS.
  • Wrote Pig and Hive scripts to analyze large data volumes.
  • Unit tested the migrated data to confirm that the target matched the source.
  • Developed the scheduling jobs in the development and production environments.
  • Created the packages for development and production deployment in the IDN portal and Kintana.
  • Documented the whole development process and trained users.
  • Maintained the system and troubleshot system-related problems.
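
A minimal BTEQ sketch of the load-and-optimize pattern described above, combining the landing-zone load, statistics collection and GRANT steps; the database, table and column names (STG_DB.CUSTOMER_STG, EDW_DB.CUSTOMER, rpt_users) are illustrative assumptions, not the actual project objects:

    .LOGON tdprod/etl_user,password;

    /* Move the staged feed into the target table */
    INSERT INTO EDW_DB.CUSTOMER
    SELECT CUST_ID, CUST_NAME, ADDR_LINE1, LOAD_DT
    FROM   STG_DB.CUSTOMER_STG
    WHERE  LOAD_DT = CURRENT_DATE;

    /* Collect statistics so the optimizer reports high confidence in EXPLAIN */
    COLLECT STATISTICS ON EDW_DB.CUSTOMER COLUMN (CUST_ID);

    /* Issue permissions on the loaded table */
    GRANT SELECT ON EDW_DB.CUSTOMER TO rpt_users;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;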

Environment: Informatica Power Center 8.x/9.x, IDQ 8.x/9.x, Oracle 11g/10g, Teradata, Toad, Unix Shell Script, SQL Server Management Studio, CVS

Confidential, Stamford, CT

Sr. Informatica Developer

Responsibilities:

  • Involved in requirements gathering and client interaction.
  • Implemented profiling using Informatica IDQ before extracting the data from Oracle.
  • Obtained approvals from the business users for the PDM and LDM models according to the requirements.
  • Cleansed the data using the Standardizer, Parser, Case Converter, Key Generator, Expression and Filter transformations.
  • Created mapplets and rules and applied those rules in the mappings and while profiling.
  • Exported the profiling results and sent them to the business analysts for approval of the data models.
  • Created reference data from the data provided by the business analysts.
  • Created mappings and loaded the data into the Teradata environment using those mappings.
  • Worked with the Address Validator transformation on different templates to validate addresses and checked the mailability and match scores.
  • Worked with flat files and relational (Oracle) sources and with the Informatica scheduler.
  • Worked on pushdown optimization and on calling Unix scripts from the ETL tool.
  • Worked with different designer tools like Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
  • Involved in creating Tasks, Sessions, Workflows and Worklets using the Workflow Manager tools: Task Developer, Worklet Designer and Workflow Designer.
  • Wrote Teradata BTEQ, FastLoad, MultiLoad and TPump scripts to load data from the UNIX landing zone (see the FastLoad sketch after this list).
  • Developed metadata using the MDF tool and loaded it into a MySQL database; wrote complex transformation logic involving the Lookup, Sequence Generator, Source Qualifier, Router, Filter and Expression transformations.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Created the development and production deployment build scripts.
  • Created MapReduce programs to clean data and apply transformations on the target side in HDFS.
  • Wrote Pig and Hive scripts to analyze large data volumes.
  • Developed multiple workflows for scheduling the jobs.
  • Created cron jobs to schedule the MapReduce and load Unix scripts per client requirements.
  • Developed the scheduling jobs in the development and production environments.
  • Created the packages for development and production deployment using Kintana.
  • Documented the whole development process and trained users.
  • Maintained the system and troubleshot system-related problems.
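
A minimal Teradata FastLoad sketch of the kind of landing-zone load script listed above; the table, column and file names (STG_DB.ACCT_STG, /landing/accounts.dat) are illustrative assumptions, not the actual project objects:

    LOGON tdprod/etl_user,password;

    DROP TABLE STG_DB.ACCT_STG_ERR1;
    DROP TABLE STG_DB.ACCT_STG_ERR2;

    /* Staging table kept as VARCHAR to accept the raw delimited feed */
    CREATE TABLE STG_DB.ACCT_STG
    ( ACCT_ID   VARCHAR(11)
    , ACCT_NAME VARCHAR(60)
    , OPEN_DT   VARCHAR(10)
    ) PRIMARY INDEX (ACCT_ID);

    SET RECORD VARTEXT "|";
    DEFINE ACCT_ID   (VARCHAR(11))
         , ACCT_NAME (VARCHAR(60))
         , OPEN_DT   (VARCHAR(10))
    FILE = /landing/accounts.dat;

    BEGIN LOADING STG_DB.ACCT_STG
          ERRORFILES STG_DB.ACCT_STG_ERR1, STG_DB.ACCT_STG_ERR2;
    INSERT INTO STG_DB.ACCT_STG VALUES (:ACCT_ID, :ACCT_NAME, :OPEN_DT);
    END LOADING;
    LOGOFF;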

Environment: Informatica Power Center 9.1/9.5, IDQ, Dollar Universe, ER/Studio Enterprise 8.5, WinSCP 5.1.4, Putty, LINUX, Oracle 11g/10g, Toad 10.5, SQL Developer, Quality Center, Unix, CVS

Confidential, Chicago, IL

Sr. Informatica Developer

Responsibilities:

  • Performed requirement analysis and data profiling on the source side (Oracle) according to the requirements.
  • Created scorecards for the profiled data to be analyzed by the analysts.
  • Obtained approvals from the business users for the PDM and LDM models according to the requirements.
  • Cleansed the data using the Standardizer, Parser, Case Converter, Key Generator, Expression and Filter transformations.
  • Created mapplets and rules and applied those rules in the mappings and while profiling.
  • Exported the profiling results and sent them to the business analysts for approval of the data models.
  • Worked with different strategies for removing duplicate data using the Match, Consolidation and Key Generator transformations.
  • Created reference data from the data provided by the business analysts.
  • Created mappings and loaded the data into the Teradata environment using those mappings.
  • Worked with the Address Validator transformation on different templates to validate addresses and checked the mailability and match scores.
  • Created and maintained database objects such as tables, views, materialized views, indexes, sequences and synonyms in Teradata and loaded data using the Teradata utilities.
  • Developed design documents and mapping design documents for the code and developed the ETLs.
  • Modified existing mappings and added new attributes.
  • Wrote PL/SQL procedures and triggers according to the requirements.
  • Involved in writing the automated Unix scripts.
  • Created the development and production deployment build scripts.
  • Created MapReduce programs to clean data and apply transformations on the target side in HDFS.
  • Wrote Pig and Hive scripts to analyze large data volumes (see the Hive sketch after this list).
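
A small HiveQL sketch of the kind of large-data analysis mentioned in the last bullet, here checking a feed for duplicate business keys; the database, table and column names (edw.customer_raw, cust_id, load_dt) are illustrative assumptions:

    -- Profile how many business keys arrive more than once
    SELECT cust_id, COUNT(*) AS dup_cnt
    FROM   edw.customer_raw
    GROUP BY cust_id
    HAVING COUNT(*) > 1;

    -- Keep only the latest record per business key
    CREATE TABLE IF NOT EXISTS edw.customer_dedup AS
    SELECT cust_id, cust_name, addr_line1, load_dt
    FROM (
      SELECT c.*,
             ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY load_dt DESC) AS rn
      FROM edw.customer_raw c
    ) t
    WHERE rn = 1;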

Environment: HDFS, MapReduce, Pig, Hive, Teradata, PL/SQL, Unix, Informatica IDQ, $U (scheduling tool), PVCS (check-in tool)

Confidential, San Jose, CA

Informatica Developer

Responsibilities:

  • Involved in developing the required ETLs for the data flow from several ERPs and an Oracle database into the target HDFS environment and Teradata database, with metadata in MySQL driving the loads to HDFS, using Informatica transformations such as Source Qualifier, Router, Expression, Aggregator and Lookup.
  • Created and maintained database objects such as tables, views, materialized views, indexes, sequences and synonyms in Teradata and developed UNIX shell scripts for loading data using the utilities.
  • Involved in developing the business logic (TV logic) SQL queries used to load the history and current data into the target Teradata tables (see the sketch after this list).
  • Involved in developing all the scripts for the table structures and views and the corresponding business views.
  • Involved in migrating the developed code from the development environment to the testing environment.
  • Wrote PL/SQL procedures and triggers according to the requirements.
  • Involved in basic testing in coordination with the QA team.
  • Involved in writing the automated Unix scripts.
  • Created the development and production deployment build scripts.
  • Created MapReduce programs to clean data and apply transformations on the target side in HDFS.
  • Wrote Pig and Hive scripts to analyze large data volumes.
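
A minimal Teradata SQL sketch of the history/current ("TV logic") load pattern referenced above; the table and column names (STG_DB.ORDER_STG, EDW_DB.ORDER_HIST, EDW_DB.ORDER_CURR) are illustrative assumptions:

    /* Append the day's snapshot to the history table */
    INSERT INTO EDW_DB.ORDER_HIST
    SELECT ORDER_ID, ORDER_STATUS, ORDER_AMT, CURRENT_DATE AS SNAPSHOT_DT
    FROM   STG_DB.ORDER_STG;

    /* Rebuild the current table with the latest snapshot per key */
    DELETE FROM EDW_DB.ORDER_CURR ALL;
    INSERT INTO EDW_DB.ORDER_CURR
    SELECT ORDER_ID, ORDER_STATUS, ORDER_AMT, SNAPSHOT_DT
    FROM   EDW_DB.ORDER_HIST
    QUALIFY ROW_NUMBER() OVER (PARTITION BY ORDER_ID ORDER BY SNAPSHOT_DT DESC) = 1;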

Environment: HDFS, MapReduce, Pig, Hive, Teradata, PL/SQL, Unix, Informatica IDQ, Kintana, $U (scheduling tool)

Confidential

Software Developer

Responsibilities:

  • Involved in developing the required ETLs for the data flow from several ERPs and an Oracle database into the target Teradata database, with metadata in MySQL driving the loads to HDFS, using Informatica transformations such as Source Qualifier, Router, Expression, Aggregator and Lookup.
  • Created and maintained database objects such as tables, views, materialized views, indexes, sequences and synonyms in Teradata and developed UNIX shell scripts for loading data using the utilities.
  • Involved in developing the business logic (TV logic) SQL queries used to load the history and current data into the target Teradata tables.
  • Involved in developing all the scripts for the table structures and views and the corresponding business views (see the DDL sketch after this list).
  • Involved in migrating the developed code from the development environment to the testing environment.
  • Wrote PL/SQL procedures and triggers according to the requirements.
  • Involved in basic testing in coordination with the QA team.
  • Created the development and production deployment build scripts.
  • Created MapReduce programs to clean data and apply transformations on the target side in HDFS.
  • Wrote Pig and Hive scripts to analyze large data volumes.
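
A short Teradata DDL sketch of the table/view scripts described above; the object names (EDW_DB.PRODUCT, EDW_VIEWS.PRODUCT_V) are illustrative assumptions:

    /* Base table with its primary index */
    CREATE TABLE EDW_DB.PRODUCT
    ( PROD_ID   INTEGER NOT NULL
    , PROD_NAME VARCHAR(100)
    , PROD_CAT  VARCHAR(30)
    , LOAD_DT   DATE
    ) PRIMARY INDEX (PROD_ID);

    /* Secondary index to support category lookups */
    CREATE INDEX (PROD_CAT) ON EDW_DB.PRODUCT;

    /* Business view exposing the reporting columns to consumers */
    REPLACE VIEW EDW_VIEWS.PRODUCT_V AS
    SELECT PROD_ID, PROD_NAME, PROD_CAT
    FROM   EDW_DB.PRODUCT;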

Environment: HDFS, MapReduce, Pig, Hive, Teradata, PL/SQL, Unix, Informatica, Kintana, $U (scheduling tool)
