
Informatica/Teradata Developer Resume


Richmond, VA

SUMMARY:

  • Dynamic IT professional with 8+ years of total IT experience and technical proficiency in Data Warehousing, combined with Business Requirements Analysis, Application Design, and Data Modeling.
  • Extensive experience in Informatica Power Center to implement data marts which involve creating, debugging and executing mappings, sessions, tasks, and workflows and testing SQL queries.
  • Experience in integration of various data sources like Oracle, DB2, SQL Server, Flat Files, XML, and Mainframe into Warehouse.
  • Well versed in Data modeling concepts incorporating dimensional modeling (star and snowflake schema), logical and physical data modeling.
  • Experience with Informatica Designer tools (transformations, reusable transformations, mappings, and mapplets) and DT Studio, including DT/Engine, DT/Designer, and DT Console.
  • Experience working with Oracle 8i/9i/10g/11g and database objects such as triggers, stored procedures, functions, packages, views, and indexes.
  • Expertise in Master Data Management concepts and methodologies, and ability to apply this knowledge in building MDM solutions.
  • Hands on experience with Informatica ILM Workbench for data masking and TDM.
  • Data integration with SFDC and Microsoft Dynamics CRM using Informatica cloud.
  • Proficient in performance tuning of Informatica mappings, transformations, and sessions, and experienced in optimizing query performance.
  • Hands-on experience creating mappings, jobs, and applications, importing sources into the TDM repository, and creating rules and policies.
  • Worked extensively with Informatica Workflow Manager (using tools such as Task Developer, Worklet and Workflow Designer, and Workflow Monitor) to build and run workflows.
  • Experienced in using Informatica Data Quality (IDQ) tools for Data Analysis.
  • Expert in using Informatica Power Exchange CDC (Change Data Capture) with Oracle databases, including DC (data conversion).
  • Experience in several facets of MDM implementations, including data profiling, data extraction, data validation, data cleansing, data match, data load, data migration, and trust score validation.
  • Experienced in installing, managing, and configuring Informatica MDM core components such as Hub Server, Hub Store, Hub Cleanse, Hub Console, Cleanse Adapters, and Hub Resource Kit.
  • Experience working in an Agile environment.
  • Expertise in OLTP/OLAP System study, Analysis and E-R modeling, developing database schemas like Star schema and Snowflake schema used in relational, dimensional and multidimensional data modeling.
  • Experience working with Teradata utilities such as BTEQ scripting, FastLoad, and MultiLoad.
  • Experience using automation tools such as Autosys and Maestro for scheduling workflows.
  • Identified and fixed bottlenecks, and tuned the complex Informatica mappings for better performance.
  • Experience of Data Marts, Data warehousing, Operational Data Store (ODS), OLAP, Star Schema Modeling
  • Strong experience in designing and working with NoSQL databases.
  • Excellent exposure on Software Development Life Cycle (SDLC) and Onshore/Offshore model.
  • Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files (a hedged FastLoad sketch follows this list).
  • Experience building very large data warehouses involving many complex transformations across numerous sources and targets.
  • Extracted data from various sources like Oracle, DB2, flat files, SQL Server, Mainframes, Teradata and loaded into Teradata and Oracle databases.
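
The FastLoad sketch referenced in the Teradata bullet above is shown below. It is a minimal, hypothetical example of loading a pipe-delimited flat file into an empty Teradata staging table; the logon string, database, table, column, and file names are placeholders, not values from any project described here.

    /* Minimal FastLoad sketch (hypothetical names throughout). */
    SESSIONS 4;
    ERRLIMIT 50;
    LOGON tdprod/etl_user,etl_password;

    DATABASE stg_db;

    BEGIN LOADING stg_db.stg_customer
          ERRORFILES stg_db.stg_customer_err1, stg_db.stg_customer_err2
          CHECKPOINT 100000;

    /* Pipe-delimited input; with VARTEXT every field is defined as VARCHAR. */
    SET RECORD VARTEXT "|";

    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(100)),
           load_dt   (VARCHAR(10))
    FILE = /data/inbound/customer.dat;

    INSERT INTO stg_db.stg_customer
    VALUES (:cust_id, :cust_name, :load_dt);

    END LOADING;
    LOGOFF;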

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 8.6, 9.1, 9.6, Informatica Cloud, Teradata

DQ Tools: Informatica Data Quality

Languages: Oracle SQL, PL/SQL

OS: UNIX, Windows

Database: Oracle 10g, 11g, SQL Server 2005, 2008R2, 2012, MySQL, DB2, Teradata

Conceptual Domain: BI & Data Warehouse Methodology

Productivity tools and utilities: Tidal, Informatica Scheduler

PROFESSIONAL EXPERIENCE:

Confidential, Richmond, VA

Informatica/Teradata Developer

Responsibilities:

  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Worked with ETL developers to create external batches that execute mappings and mapplets using the Informatica Workflow Designer, integrating Shire's data from varied sources such as Oracle, DB2, flat files, and SQL databases and loading it into the landing tables of the Informatica MDM Hub.
  • Analyzed the system for the functionality required as per the requirements and created System Requirement Specification document (Functional Requirement Document).
  • Created Informatica Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, Normalizer, Union, and Connected and Unconnected lookups, Filters, Sequence, Router and Update Strategy.
  • Involved in the development of the conceptual, logical and physical data model of the star schema using ERWIN.
  • Applied Master Data Management (MDM) and data integration concepts in large-scale implementation environments.
  • Extensively used ETL to load data from heterogeneous sources like flat files (Fixed Width and Delimited), Oracle tables, and DB2 tables to the data warehouse.
  • Worked on Informatica Test Data Management Data Masking Tool for data subset and data security.
  • Developed various complex mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure and Sorter transformations.
  • Developed PL/SQL procedures for processing business logic in the database and used them in the Stored Procedure Transformation.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for optimized performance.
  • Copied subsets of secured data from production databases to development and testing environments using Test Data Management (TDM).
  • Developed Cloud mappings to extract the data for different regions
  • Fixed invalid mappings, tested stored procedures and functions, and performed unit testing of Informatica Sessions and Workflows.
  • Prepared and documented unit test plans and resolved issues escalated from system testing.
  • Used Teradata utilities like Fast Load, Multiload, Fast Export and BTEQ.
  • Loaded data into Teradata data warehouse.
  • Supported all inbound and outbound ETL feeds of TDM in the production environment with an SOA gateway.
  • Collaborated with Project Manager, Tech Lead, Developers, QA teams and Business SMEs to ensure delivered solutions optimally support the achievement of business outcomes.
  • Implemented various slowly changing dimension mappings based on the related star schema data model.
  • Created logical and physical data models using ERWIN.
  • Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for a few of the base tables, in order to check for consistency (a hedged SQL sketch of such a check follows this list).
  • Worked on UNIX shell scripting and automation of ETL processes using scheduling tool Autosys.
  • Used Incremental aggregation for CDC (Change Data Capture).
  • Worked with BI teams on Business Objects (BO) reporting and developed tabular, matrix, parameterized, and chart reports.
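
As a hedged illustration of the daily consistency check mentioned above, the shell scripts essentially wrapped a query of the following shape; the audit table, warehouse table, and column names are hypothetical, and the real scripts compared counts for several base tables.

    -- Compare today's incremental-load row count against the count recorded
    -- by the ETL audit process (all object names are hypothetical).
    SELECT  a.table_name,
            a.expected_row_count,
            COALESCE(t.loaded_row_count, 0) AS loaded_row_count,
            CASE WHEN a.expected_row_count = COALESCE(t.loaded_row_count, 0)
                 THEN 'OK' ELSE 'MISMATCH' END AS load_status
    FROM    etl_audit.load_control a
    LEFT JOIN (
            SELECT 'CUSTOMER_DIM' AS table_name, COUNT(*) AS loaded_row_count
            FROM   edw.customer_dim
            WHERE  load_dt = CURRENT_DATE
    ) t ON t.table_name = a.table_name
    WHERE   a.load_dt = CURRENT_DATE;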

Environment: Informatica Power Center 9.6.1/9.0.1, Informatica Multi domain MDM 9.5.0, TDM, Oracle 11g, BPM, Teradata 14.0, Flat files, Erwin 7.2, SQL Developer & Toad, shell scripting, Autosys.

Confidential, Ontario, CA

Sr. ETL/ Informatica Developer

Responsibilities:

  • Used Informatica Power Center to create data maps for mainframe files, which involved creating, debugging, and executing mappings, sessions, tasks, and workflows.
  • Created Informatica Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, Normalizer, Union, and Connected and Unconnected lookups, Filters, Sequence, Router and Update Strategy.
  • Created and monitored sessions and batches using Server Manager to load data into the target database.
  • Created Queries, Query Groups and packages in MDM Hub Console
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Testing of Informatica Sessions, Batches and the Target Data.
  • Responsible for updating web application code and queries based on business to business (B2B) requirements.
  • Worked on creating rules and policies using Informatica TDM.
  • Extracted the raw data from Microsoft Dynamics CRM to staging tables using Informatica Cloud.
  • Imported IDQ mapplets into Informatica and used them in Informatica mappings.
  • Involved in the migration of mapplets from IDQ to Power Center.
  • Assisted with the design, build, and testing of Metadata Hub management.
  • Designed and developed Unix Shell scripts to generate the run time parameters for the underlying ETL Informatica workflows from the DB Metadata entries.
  • Created Oracle/Netezza views and tables as facts or dimensions based on requirements.
  • Used the Teradata utilities FastLoad (FLOAD), MultiLoad (MLOAD), and FastExport (FEXP), and created batch jobs using BTEQ.
  • Imported Metadata from Teradata tables.
  • Developed complex mappings to load data from Source System (Oracle) and flat files to Teradata.
  • Worked on Teradata SQL Assistant querying the source/target tables to validate the BTEQ scripts.
  • Converted unstructured data and loaded it into the target database with the B2B Data Transformation tool of Informatica.
  • Involved in loading of data into Teradata from legacy systems and flat files using complex scripts.
  • Analyzed existing database schemas and designed star schema models to support users' reporting needs and requirements.
  • Delivered end-to-end Data Archive and Data Subset solutions using Informatica Power Center, ILM, and TDM.
  • Cleansed the source data using Informatica Data Quality tool, extracted and transformed data with business rules, built reusable mappings, known as ‘Mapplets’ using Informatica Designer.
  • Involved in the development of Informatica mappings and used Informatica partitioning (parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities) for performance tuning thereby reducing the load time.
  • Worked on mainframe files, created copybooks, imported the source files, and set up the respective connections.
  • Worked closely with the admin and helped in masking of sensitive information using Informatica TDM.
  • Created source to target mappings, data repository and data models for the Data Warehouse.
  • Developed the data warehouse model and ETL strategy.
  • Created and maintained Oracle stored procedures (PL/SQL).
  • Developed various PL/SQL packages, stored procedures, and functions, along with database objects such as views, cursors, ref cursors, indexes, and triggers as required (a hedged PL/SQL sketch follows this list).
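
The PL/SQL sketch referenced in the last bullet above is shown below. It is a hypothetical example of the kind of procedure that can be wired into an Informatica Stored Procedure transformation; the procedure, table, and column names are illustrative only.

    -- Hypothetical procedure returning a derived status for a customer key,
    -- of the kind called from a Stored Procedure transformation.
    CREATE OR REPLACE PROCEDURE get_customer_status (
        p_cust_id IN  NUMBER,
        p_status  OUT VARCHAR2
    ) AS
    BEGIN
        SELECT CASE
                   WHEN last_order_dt >= ADD_MONTHS(SYSDATE, -12) THEN 'ACTIVE'
                   ELSE 'INACTIVE'
               END
        INTO   p_status
        FROM   dw_customer
        WHERE  customer_id = p_cust_id;
    EXCEPTION
        WHEN NO_DATA_FOUND THEN
            p_status := 'UNKNOWN';
    END get_customer_status;
    /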

Environment: Informatica Power Center, Oracle 11g, TDM, Informatica MDM (formerly Siperian) 9.1, Informatica B2B, IDQ, SQL, NoSQL, TOAD 9.5, UNIX, Teradata, Oracle eTRM 12.0.0, Metadata, Windows 2000/XP/7, Autosys, Erwin 4.1, UC4.

Confidential, Foster City, California

Informatica Developer

Responsibilities:

  • Designed and developed complex mappings in Informatica to load the data from various sources using different transformations such as SQL, Source qualifier, Look up (connected and unconnected), Expression, Aggregate, Update strategy, Sequence generator, Joiner, Filter, Rank, and Router transformations.
  • Designed and developed mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows, backups, and recovery.
  • Extensively worked with database and Informatica Partitioning for performance tuning.
  • Extensively designed and developed reusable transformations & aggregations and created target mappings that contain business rules.
  • Implemented Informatica Data Quality for data cleansing, data matching, reporting, and monitoring of XML files.
  • Involved in writing UNIX shell scripts for Informatica ETL tool to run the Sessions.
  • Responsible for developing data ingestion routines, creating standards and reusable code and performing Informatica Data Transformation (B2B) development.
  • Worked with different sources like Oracle, flat files, XML files, DB2, MS SQL Server.
  • Developed complex Teradata SQL code in BTEQ scripts using OLAP and aggregate functions, among others (a hedged BTEQ sketch follows this list).
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Designed and Developed the Netezza SQL Scripts as per the Customer requirements.
  • Performance tuning of Netezza SQL scripts.
  • Used Power Center Repository Manager to manage the repository and to import and export objects between different Informatica environments, exchanging metadata with other data warehousing tools.
  • Built UNIX shell scripts to automate the ETL processes.
  • Created and configured workflows, worklets, and sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Created several packages to set up and share global variables, types and transforms which were extensively used for many Ab Initio graphs.
  • Fixed invalid mappings, tested stored procedures and functions, and performed unit testing of Informatica Sessions and Workflows.
  • Implemented Slowly Changing Dimensions (SCD) to update the dimensional schema.
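
The BTEQ sketch referenced in the Teradata SQL bullet above is shown below: a hypothetical script that uses an OLAP (windowed) RANK function with QUALIFY to keep each customer's three largest orders. The object names and logon string are placeholders.

    .LOGON tdprod/etl_user,etl_password;

    /* Rank each customer's orders by amount and keep the top three
       (hypothetical table and column names). */
    SELECT  customer_id,
            order_id,
            order_dt,
            order_amt,
            RANK() OVER (PARTITION BY customer_id ORDER BY order_amt DESC) AS amt_rank
    FROM    edw.order_fact
    QUALIFY RANK() OVER (PARTITION BY customer_id ORDER BY order_amt DESC) <= 3;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;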

Environment: Informatica Power Center, Oracle 11g, SQL, TOAD 9.5, UNIX, Autosys, Netezza, Metadata, Teradata, Erwin 4.1, Putty, OBIEE.

Confidential, Kansas City, MO

Informatica Developer

Responsibilities:

  • ETL design and development using Informatica Power Center toolset.
  • Created Informatica Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, Normalizer, Union, and Connected and Unconnected lookups, Filters, Sequence, Router and Update Strategy.
  • Created Oracle/Netezza views and tables as facts or dimensions based on requirements.
  • Technical documentation of test cases and results for all development activities.
  • Job workflow definition and setup, Job monitoring and issue resolution.
  • Designed and developed mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows, backups, and recovery.
  • Extensively worked with database and Informatica Partitioning for performance tuning.
  • Extensively designed and developed reusable transformations & aggregations and created target mappings that contain business rules.
  • Implemented complex ETL logic using SQL overrides in the source qualifier.
  • Used Incremental Aggregation in the Aggregator transformation to ensure the measures for certain aggregate tables were calculated correctly.
  • Implemented Change Data Capture (CDC) to extract information and Partitioned sessions for concurrent loading of data into the target tables.
  • Fixed invalid mappings, tested stored procedures and functions, and performed unit testing of Informatica Sessions and Workflows.
  • Implemented Slowly Changing Dimensions (SCD) to update the dimensional schema (a hedged Type 2 SQL sketch follows this list).
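
A hedged sketch of the Type 2 logic behind the SCD bullet above is shown below, expressed as plain SQL rather than as the Informatica lookup/update-strategy mapping itself; the staging table, dimension table, sequence, and column names are hypothetical.

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE customer_dim d
    SET    d.eff_end_dt  = TRUNC(SYSDATE) - 1,
           d.current_flg = 'N'
    WHERE  d.current_flg = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    (s.cust_name <> d.cust_name
                           OR s.cust_segment <> d.cust_segment));

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO customer_dim (customer_key, customer_id, cust_name, cust_segment,
                              eff_start_dt, eff_end_dt, current_flg)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.cust_name, s.cust_segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   customer_dim d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flg = 'Y');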

Environment: Informatica Power Center/Power Exchange Development 9.1.0, Oracle 11g, SQL, TOAD 9.5, UNIX, Oracle eTRM 12.0.0, PL/SQL, Netezza, Windows 2000/XP/7, Vermillion, Autosys, Erwin 4.1, Putty, OBIEE.

Confidential

Datawarehouse ETL Developer

Responsibilities:

  • Gathered and analyzed business requirements, converted them into specifications, and worked as a team member on the design and development of the Data Mart and its integration with the Global Data Warehouse.
  • Used BTEQ utility to communicate with Teradata RDBMS systems.
  • Extracted data from DB2 on the mainframe using Informatica Power Exchange, and also used it for change data capture and to set up the restart token file.
  • Designed and developed a robust end to end ETL process for the efficient extraction, transformation and loading of Sequential flat files to the staging and then to the data mart.
  • Worked with the data analyst to analyze Source system data and Designed data validation strategy.
  • Created database objects such as tables, stored procedures, and indexes, and partitioned the tables where the data volume was huge, to improve reporting performance.
  • Implemented complex ETL logic using SQL overrides in the Source Qualifier (a hedged example override follows this list).
  • Used Incremental Aggregation in the Aggregator transformation to make sure the measures for certain aggregate tables got calculated properly.
  • Used stored procedure transformation to integrate the business logic implemented in Oracle stored procedures with Informatica.
  • Created shell scripts to handle File Management that involved file transfer, external auditing and file archiving.
  • Implemented complex logic in the transformations to cleanse the customer data periodically by creating reusable Mapplets.
  • Created workflows, worklets and designed dependency among them using workflow manager.
  • Worked with tools like TOAD and SQL*Plus to write queries and generate results.
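
As a hedged illustration of the Source Qualifier SQL override mentioned above, the query below shows the general shape of such an override with an incremental filter driven by a mapping parameter; the $$LAST_EXTRACT_DT parameter and all table and column names are hypothetical.

    -- Hypothetical Source Qualifier override: join the source tables in the
    -- database and extract only rows changed since the last successful run.
    SELECT  o.order_id,
            o.customer_id,
            c.customer_name,
            o.order_amt,
            o.last_update_dt
    FROM    ods_orders    o
    JOIN    ods_customers c
      ON    c.customer_id = o.customer_id
    WHERE   o.last_update_dt > TO_DATE('$$LAST_EXTRACT_DT', 'YYYY-MM-DD HH24:MI:SS')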

Environment: Informatica Power Center 8.1, Oracle 9i, SQL Server 2005, SQL*Plus, Toad 8.6, Erwin 4.0, UNIX, Shell Scripts, Teradata, Mainframes, COBOL, Business Objects 6.5.
