Senior Informatica/Teradata Developer Resume


Lafayette, LA

SUMMARY

  • 7+ years of experience in ETL development using Informatica Power Center 10.1 and higher.
  • Good knowledge of data warehouse concepts and principles, including Informatica MDM, star schema, snowflake schema, slowly changing dimensions (SCD), surrogate keys, and normalization/denormalization.
  • Experience integrating various data sources with multiple relational databases such as Oracle, SQL Server, DB2, Teradata, Netezza, and Hadoop, as well as flat files (fixed-width and delimited).
  • Extensive experience developing ETL processes for data extraction, transformation, and loading using Informatica Power Center.
  • Well acquainted with Informatica Designer Components - Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
  • Worked extensively with complex mappings using transformations such as Source Qualifier, Expression, Sorter, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Normalizer, Update Strategy, and Aggregator.
  • Strong Experience in developing Sessions/tasks, Worklets, Workflows using Workflow Manager Tools - Task Developer, Worklet Designer and Workflow Designer.
  • Hands-on experience with Informatica Data Quality (IDQ) tools for data analysis, data profiling, and rule development in IDQ Developer.
  • Worked on Informatica partitioning for performance improvement.
  • Experience using Informatica command-line utilities such as pmcmd to execute workflows in non-Windows environments (a minimal pmcmd sketch follows this list).
  • Extensively used Informatica Repository Manager and Workflow Monitor.
  • Extensively used Metadata Manager to do impact analysis and to find data lineage.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Hands on experience in Performance Tuning of sources, targets, transformations and sessions.
  • Worked on the Test Data Management (TDM) tool for data discovery, data validation, and data masking for de-identification and testing.
  • Good experience in documenting the ETL process flow for better maintenance and analyzing the process flow.
  • Worked with Stored Procedures, Triggers, Cursors, Indexes and Functions.
  • Worked on UNIX shell scripting; developed UNIX scripts using the pmcmd command.
  • Worked on scheduling ETL jobs using schedulers like Autosys, Active Batch and Control-M.
  • Strong Knowledge in optimizing database SQL queries.
  • Experience with Genesis copybooks (VSAM copybooks) for extracting, transforming, and loading data into the data warehouse; experience working in a mainframe DB2 database environment and FTP'ing files to the mainframe.
  • Experience handling VSAM files.
  • Experience with the Hana DB2 Modeling tool.
  • Migrated mappings/workflows to higher environments, such as the test and production regions, using Repository Manager.
  • Worked on monitoring, troubleshooting and restarting batch processes using Informatica Power Center.
  • Demonstrated ability to work and communicate effectively with both business and technical audiences.
  • Highly motivated to take on independent responsibility, with the ability to contribute as a productive team member.
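
Below is a minimal sketch of the kind of pmcmd wrapper script referenced above. It is illustrative only: the domain, integration service, folder, and workflow names (DOM_DEV, IS_DEV, DWH_FOLDER, wf_load_customer_dim) and the credential variables are hypothetical placeholders.

    #!/bin/sh
    # Hypothetical pmcmd wrapper: starts an Informatica workflow and waits
    # for it to finish; pmcmd exits 0 when the workflow succeeds.
    pmcmd startworkflow -sv IS_DEV -d DOM_DEV \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f DWH_FOLDER -wait wf_load_customer_dim
    if [ $? -ne 0 ]; then
        echo "wf_load_customer_dim failed" >&2
        exit 1
    fi

Scripts of this shape are what schedulers such as Autosys or Control-M typically invoke.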

TECHNICAL SKILLS

  • ETL Tools: Informatica Power Center 9.6, 10.1, 10.2, Informatica Data Quality (IDQ), MDM Hub, Metadata Manager, Talend, OWB (Oracle Warehouse Builder)
  • Databases: Oracle 11g, SQL Server, DB2, Mainframe, Netezza, Teradata, Hadoop
  • GUI Tools: Toad, SQL Developer, SQL*Plus, SQL Server Management Studio, Aginity Workbench for Netezza, TDM (Test Data Management), Teradata SQL Assistant, Hana DB2 Modeling
  • Files: VSAM Files, Genesis Copybooks, FTP Files, XML & XSD
  • Scheduling Tools: Informatica Scheduler, Autosys, Control-M, CA7-Passport
  • Scripting Languages: UNIX Shell Scripting
  • Operating Systems: Windows XP Professional/7, LINUX, UNIX
  • Database Tuning: SQL Tuning, Explain Plan, Table Partitioning, Materialized Views, Analytical Views, Hints
  • Agile Tools: Version One, JIRA, Rally, TFS

PROFESSIONAL EXPERIENCE

Senior Informatica/Teradata Developer

Confidential, Lafayette, LA

Responsibilities:

  • Designed the ETL processes using Informatica to load data from Teradata and flat files (fixed-width) into the staging database, and from staging into the target Oracle data warehouse database.
  • Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Involved in cleansing and extraction of data and defined quality process for the warehouse.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
  • Designed and developed the logic for loading slowly changing dimension tables by creating SCD Type 1 and Type 2 mappings, using the Update Strategy transformation to flag records and populate the desired current and historical versions (see the SCD Type 2 sketch after this list).
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
  • Used the Debugger to troubleshoot existing mappings.
  • Developed UNIX shell scripts to control the process flow for Informatica workflows to handle high volume data.
  • Wrote SQL queries and PL/SQL procedures in Oracle, SQL Server, and Teradata.
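
As a rough illustration of the SCD Type 2 pattern described above, the sketch below expresses the expire-and-insert logic as SQL driven from a shell script; in the project itself this logic lived in Informatica Lookup and Update Strategy transformations. All connection, table, column, and sequence names (dw.customer_dim, stg.customer_stg, customer_dim_seq) are hypothetical.

    #!/bin/sh
    # Hedged sketch: SCD Type 2 expire-and-insert against a hypothetical
    # customer dimension, run through SQL*Plus.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    -- Step 1: close out current rows whose source attributes changed.
    UPDATE dw.customer_dim d
       SET d.end_dt = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg.customer_stg s
                    WHERE s.customer_nk = d.customer_nk
                      AND s.customer_name <> d.customer_name);

    -- Step 2: insert a new current version for changed and brand-new keys.
    INSERT INTO dw.customer_dim
           (customer_sk, customer_nk, customer_name, eff_dt, end_dt, current_flag)
    SELECT dw.customer_dim_seq.NEXTVAL, s.customer_nk, s.customer_name,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg.customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM dw.customer_dim d
                        WHERE d.customer_nk = s.customer_nk
                          AND d.current_flag = 'Y');
    COMMIT;
    EOF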

Environment: Informatica Power Center, Teradata 15, Oracle 11g, TOAD, SQL Server, PL/SQL, Windows 7, UNIX.

Senior Software Developer

Confidential, Phoenix, AZ

Responsibilities:

  • Worked closely with data modelers to create external tables and views in the Hive environment.
  • Worked closely with data modelers to define the requirements and build Intake and Tollgate forms for PG Classification and DS Approval.
  • Worked closely with business analysts and source systems to understand the functional requirements and how data flows into the system.
  • Created ETL Technical design specification documents based on the functional design documents and the physical data model.
  • Performed lead duties such as reviewing the ETL source-to-target flow documents with other developers, assigning ETL tasks, reviewing their code, and helping them resolve performance bottlenecks.
  • Used Informatica Metadata Manager for impact analysis and metadata lineage.
  • Used Informatica Data Quality for address validation and SSN validation before processing the records through the ETL.
  • Developed slowly changing dimensions using the Informatica dynamic lookup concept to capture same-day data variations.
  • Developed Informatica mappings and workflows as per the requirements and based on the existing code which was developed using Informatica Power Center 9.6.
  • Prepared migration artifacts in order to deploy the code to higher environments.
  • Involved in Hadoop reconciliation (Recon) testing, including production parallel runs, data comparison, and researching mismatches.
  • Prepared Teradata custom views for user data consumption.
  • Worked with DBAs to set up user security for sensitive data.
  • Developed standards and procedures for automation process of registration through Unified Data Services (UDS).
  • Created Hive external tables and views over the Hadoop Distributed File System (HDFS) (see the Hive sketch after this list).
  • Registered datasets into UDS Portal to be consumed as HDFS Files.
  • Worked closely with users to set up the functional ID in order to access the Production data.
  • Created Control-M jobs to load data into Hive partitions.
  • Involved in unit testing and user acceptance testing to verify that data loads from Hadoop files into Hive the same way it loads into Teradata.
  • Involved in code migration process to higher environments.
  • Peer reviewed the code developed by other developers.
  • Worked effectively in an Agile managed team.
  • Extensively used the Agile tool JIRA to track tasks and log work on a daily basis.
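
A minimal sketch of the Hive external table work described above; the database, table, columns, delimiter, and HDFS path are hypothetical placeholders.

    #!/bin/sh
    # Hedged sketch: define a Hive external table over HDFS files and
    # register the partition that the Control-M job loaded for the day.
    hive <<'EOF'
    CREATE EXTERNAL TABLE IF NOT EXISTS edw_stage.customer_ext (
        customer_id   BIGINT,
        customer_name STRING,
        load_ts       STRING
    )
    PARTITIONED BY (business_dt STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION '/data/edw/customer';

    ALTER TABLE edw_stage.customer_ext ADD IF NOT EXISTS
        PARTITION (business_dt = '2018-06-30');
    EOF

Because the table is external, dropping it removes only the Hive metadata; the underlying HDFS files stay in place.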

Environment: Informatica, Talend, Teradata, Hadoop, Hive, Oracle, SQL Server, Query-It, SQL Developer, Teradata SQL Assistant, Toad, PL/SQL, Linux, Agile, JIRA.

Informatica Developer/Oracle Developer

Confidential, Columbus, OH

Responsibilities:

  • Prepared the required application design documents and ETL Source to Target flow documents based on business requirement and functionality requirement.
  • Designed the ETL processes using Informatica to load data from Oracle, SQL Server, and flat files (fixed-width and delimited) into the staging database, and from staging into the target Oracle data warehouse and Salesforce.
  • Extensively used DB2 Log Analyzer files and mainframe copybooks to load data into the Netezza database; integrated data coming from outside the enterprise and managed the business process.
  • Worked with Genesis copybooks (VSAM copybooks) to extract, transform, and load data into the data warehouse; worked in a mainframe DB2 database environment and FTP'd files to the mainframe.
  • Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
  • Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, Sequence Generator and Normalizer.
  • Involved in cleansing and extraction of data and defined quality process for the warehouse.
  • Used TDM (Test Data Management) for UAT and PROD data validation and policy-driven data-masking techniques.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
  • Worked in IDQ Analyst for profiling, creating profiling rules, and building scorecards.
  • Involved in setting workflow dependencies using event wait tasks.
  • Implemented reusable objects such as mapplets to reuse transformation logic across multiple mappings.
  • Designed and developed mappings for loading the Master Data Management (MDM) Hub.
  • Used the native Hana DB2 Modeling tool for mainframe database modeling.
  • Designed IDQ mappings that are used as mapplets in Power Center.
  • Transferred files to different servers via SFTP (see the SFTP sketch after this list).
  • Used XML and XSD as targets to load the integrated data.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
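
A minimal sketch of the batch-mode SFTP transfer mentioned above, assuming key-based authentication is already configured; the host, user, and file paths are hypothetical.

    #!/bin/sh
    # Hedged sketch: push an extract file to a remote server with sftp in
    # batch mode ("-b -" reads the batch commands from standard input).
    sftp -b - etluser@target-host <<'EOF'
    put /outbound/customer_20180630.dat /inbound/
    bye
    EOF
    if [ $? -ne 0 ]; then
        echo "SFTP transfer failed" >&2
        exit 1
    fi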

Environment: Informatica Power Center 9.6, Informatica Metadata Manager, MDM Hub, IDQ, Oracle 10g/9i, DB2, Netezza, Mainframe Copybooks, XML and XSD, TOAD, SQL, PL/SQL, SQL Server Management Studio, TDM (Test Data Management), Teradata, Windows, UNIX, FTP.

Informatica Developer/Oracle developer

Confidential, Miamisburg, OH

Responsibilities:

  • Coordinated the team for design discussions, code reviews, and mentoring.
  • Worked with Informatica Power Center client tools like Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Worked with Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Worked with Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager.
  • Designed and developed mappings utilizing the Source Qualifier, Expression, Filter, Joiner, Lookup, Router, Sorter, Union, Normalizer, Rank, Sequence Generator, and Update Strategy transformations.
  • Implemented Type 1 slowly changing dimensions to overwrite dimensional data in place (a minimal MERGE sketch follows this list).
  • Loaded data into Oracle and Netezza database tables using Informatica mappings.
  • Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
  • Implemented the error-handling process using relational database tables.
  • Performed code reviews for other developers, fixed invalid mappings, and troubleshot database issues.
  • Performed performance tuning of mappings during the UAT and maintenance phases.
  • Tracked production failures and debugged sessions using the session logs.
  • Optimized Query Performance, Session Performance and Reliability.
  • Used SQL tools such as Toad and SQL Developer to run SQL queries and validate the data.
  • Used PNC internal scheduler and Informatica Scheduler for scheduling the jobs.
  • Designed and implemented ETL code for address verification and identity checking using IDQ.
  • Designed IDQ mappings that are used as mapplets in Power Center.
  • Ran MDM jobs through to the base objects whenever data was needed, and investigated issues and rejects in the landing-to-staging flow in DEV.
  • Performed code reviews and unit testing at various levels of the ETL.
  • Involved in migrating mappings, sessions, and workflows from the Dev environment to Test and from Test to Production.
  • Integrated UNIX shell scripting with the Informatica pmcmd command utility.
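
A rough sketch of the SCD Type 1 overwrite pattern flagged above, expressed as an Oracle MERGE run from a shell script; in the actual mappings this was implemented with Lookup and Update Strategy transformations. Table, column, and sequence names are hypothetical.

    #!/bin/sh
    # Hedged sketch: SCD Type 1 upsert against a hypothetical product
    # dimension; matched rows are overwritten, so no history is kept.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    MERGE INTO dw.product_dim d
    USING stg.product_stg s
       ON (d.product_nk = s.product_nk)
     WHEN MATCHED THEN
       UPDATE SET d.product_name = s.product_name,
                  d.category     = s.category
     WHEN NOT MATCHED THEN
       INSERT (product_sk, product_nk, product_name, category)
       VALUES (dw.product_dim_seq.NEXTVAL, s.product_nk,
               s.product_name, s.category);
    COMMIT;
    EOF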

Environment: Informatica, Oracle 11g, Toad, SQL Developer, SQL*Plus, Netezza, Aginity Workbench for Netezza, MDM Hub, IDQ, UNIX, XML, FTP, TDM.
