
Informatica Developer Resume


SUMMARY

  • Over 8 years of IT experience spanning Data Warehousing, Data Analysis, Reporting, ETL, Data Modelling, Development, Maintenance, Testing and Documentation.
  • Over 3 years of experience working with the Data Vault modelling approach for the Enterprise Data Warehouse.
  • Strong knowledge of the Entity-Relationship concept, fact and dimension tables, and developing database schemas such as Star schema and Snowflake schema used in relational, dimensional and multidimensional data modelling.
  • Proficient in integrating various data sources from multiple relational databases such as Oracle 11g/10g/9i, MS SQL Server, DB2, Netezza, Teradata and flat files into the staging area, ODS, Data Warehouse and Data Mart.
  • Hands-on experience with Teradata 12.0/13.0 and Teradata SQL Assistant.
  • Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server (T-SQL), Teradata SQL and Oracle PL/SQL.
  • Extensively used Teradata utilities such as TPump, FastLoad, MultiLoad, BTEQ, FastExport and TPT (Teradata Parallel Transporter).
  • Experience in developing XML/XSD/XSLT both as source XML files for Informatica and as input XML for web service calls.
  • Experienced in developing Test Plans, Test Strategies and Test Cases for Data Warehousing projects ensuring the data meets the business requirements.
  • Maintaining and Supporting during Assembly/Integration Test, QA, UAT, and Production for Bug fixes/Defects.
  • Created Projects, Models using Cognos Framework Manager and published packages to Cognos Server for reporting authoring.
  • Good experience in Automation and scheduling of UNIX shell scripts and Informatica sessions and batches using CONTROL-M and WLM scheduling tools.
  • Good experience in SQL tuning and Informatica performance tuning; tuned Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
  • Experience with coordinating and leading onsite-offshore development.
  • Excellent team player with very good communication skills and leadership qualities.

TECHNICAL SKILLS

Operating Systems: Windows XP/7, UNIX, Ubuntu

Databases and utilities: Oracle 9i/10g/11g, MS SQL Server, Teradata V2R6/12.0/13.0, Teradata SQL Assistant, BTEQ, Sybase, Erwin 3.5/4.0, IBM DB2, Business Objects.

Informatica Tools: Informatica PowerCenter 10.2/9.6/9.5/8.6/8.1, IDQ methodology, Informatica PowerExchange, Informatica Analyst.

Database Tools: SQL developer, TOAD

Programming Languages: SQL, PL/SQL, UNIX Shell Script, Python, HTML.

Other Tools: Control-M, WLM, Tivoli

PROFESSIONAL EXPERIENCE

Confidential

Informatica Developer

Environment: Informatica PowerCenter 10.2/9.6, IDQ methodology, PowerExchange 9.5, Oracle 11g, Toad for Oracle 12, IBM DB2, VSAM, AIX UNIX 5.2, Korn/Bash shell scripting, Control-M.

Responsibilities:

  • Analyse the current Enterprise Data Warehouse environment to understand data loading, source-data normalization processes, the database environment, batch jobs, reporting and dashboard publishing systems, and performance tuning, covering analysis, design, testing and implementation.
  • Implementing ETL solutions, particularly utilizing Informatica Powercenter 10.2.
  • Understand the current application issues, challenges and technology bottlenecks, if any.
  • Understand the current support requirements from the IT team: support processes, SLAs, ticket escalation processes, and response and resolution times.
  • Identify gaps in the processes for managing the support requirements that the proposed system must meet.
  • Plan, together with the business, the resources, timelines and effort involved in maintaining the ongoing ETL and DW environments and application objects, ensuring the availability and uptime specified in the IT guidelines.
  • Identify any probable risks that may harm the proposed application support and highlight for further analysis by stakeholders.
  • Develop PL/SQL procedures for dropping and recreating indexes on target tables using pre-load and post-load strategies, improving session performance during bulk loading.
  • Work with production support to finalize workflow scheduling in Control-M, deploy the workflows to the production server, and support them through automated shell scripts.
  • Test and validate the Informatica mappings against the pre-defined ETL design standards.
  • Prepare and maintain ETL documentation following Informatica standards, procedures and naming conventions.
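The pre-load/post-load index strategy mentioned above can be sketched as follows. This is a minimal illustration in Python that only generates the SQL statements a session would run (the actual procedures were written in PL/SQL); the table and index names are hypothetical:

```python
# Sketch of a target pre-load/post-load index strategy for bulk loading:
# drop secondary indexes before the load, recreate them afterwards.
# Table and index names below are hypothetical examples.

def pre_load_sql(table, indexes):
    """SQL for the target pre-load step: drop each secondary index."""
    return [f"DROP INDEX {idx_name}" for idx_name, _ in indexes]

def post_load_sql(table, indexes):
    """SQL for the target post-load step: recreate each index."""
    return [f"CREATE INDEX {idx_name} ON {table} ({cols})"
            for idx_name, cols in indexes]

if __name__ == "__main__":
    indexes = [("idx_cust_name", "last_name, first_name"),
               ("idx_cust_region", "region_id")]
    for stmt in pre_load_sql("customer_dim", indexes):
        print(stmt)
    for stmt in post_load_sql("customer_dim", indexes):
        print(stmt)
```

Dropping the indexes lets the bulk insert avoid per-row index maintenance; rebuilding them once after the load is usually far cheaper.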

Confidential

Informatica /Teradata Developer

Responsibilities:

  • Analyzing business processes, functions and existing transactional database schemas, and designing star schema models to support users' reporting needs and requirements.
  • Communicating with business users to resolve their issues, and following up with users and the team until resolution.
  • Developed ETL procedures to ensure conformity, compliance with Confidential standards.
  • Used Teradata utilities (BTEQ, FastExport, FastLoad, MultiLoad) to export and load data to and from flat files.
  • Created Migration documentation and Process Flow for mappings and sessions.
  • Developed new mappings based on existing business logic from the mainframe (DB2) system to work with Teradata databases as source systems.
  • Working on the enterprise-wide data repository at Confidential, called the Enterprise Data Warehouse and Research Depot (EDWard).
  • Generated delimited flat files that were sent to the mainframe system via FTP in a post-session command task.
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Developed the transformation logic; identifying and tracking the slowly changing dimensions, heterogeneous sources and determining the hierarchies in dimensions.
  • Created volatile tables, global temporary tables, indexes, macros, stored procedures and Teradata analytical functions supporting data quality, fallback, table joins and access methods.
  • Developed Teradata macros, FastLoad scripts and customized business views. Hands-on experience leading all stages of system development, including requirement definition, design, architecture, testing, support and administration, along with data reconciliation and error handling.
  • Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
  • Analyzed all critical and high-priority defects and was fully responsible for fixing them.
  • Involved in various transformations like Source Qualifier, Look up, Expression, Aggregate, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router transformations.
  • Created Informatica server administration automation scripts, Shell scripts for data manipulation and process automation using UNIX Shell scripting.
  • Generated BTEQ scripts for data manipulation, validation and provided DDL scripts for creation of database objects such as tables, indexes, views, sequences, object types, collection types and Materialized Views.
  • Created, scheduled and monitored batches and sessions using PowerCenter Server Manager/WLM.
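The slowly changing dimension handling described above (Type II history tracking) can be sketched as follows. This is a simplified Python illustration of the logic the mappings implemented, assuming hypothetical column names and ISO date strings; the real implementation used Informatica transformations against Teradata tables:

```python
# Sketch of SCD Type II logic: when a tracked attribute changes, expire the
# current dimension row and insert a new current version. Column names are
# hypothetical; dates are passed as ISO strings for simplicity.

def apply_scd2(dim_rows, incoming, today):
    """dim_rows: list of dicts with keys id, attrs, eff_date, end_date, current.
    incoming: dict with keys id and attrs. Mutates and returns dim_rows."""
    current = next((r for r in dim_rows
                    if r["id"] == incoming["id"] and r["current"]), None)
    if current is None or current["attrs"] != incoming["attrs"]:
        if current is not None:
            # Expire the previous version instead of overwriting it.
            current["end_date"] = today
            current["current"] = False
        # Insert the new current version with an open-ended effective range.
        dim_rows.append({"id": incoming["id"], "attrs": incoming["attrs"],
                         "eff_date": today, "end_date": None, "current": True})
    return dim_rows
```

An unchanged incoming record leaves the dimension untouched; a changed one produces two rows for the same business key, preserving history for point-in-time reporting.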

Environment: Informatica PowerCenter 9.6.1, Teradata, BTEQ scripts, DB2, IBM ClearCase, Citrix, WLM, Autosys, UNIX shell, Teradata utilities, flat files (fixed-width/delimited).

Confidential - Memphis, TN

Informatica IDQ

Responsibilities:

  • Involved in analysis of source systems, business requirements and identification of business rules, and responsible for development, support and maintenance of the ETL process using Informatica.
  • Designed, developed, implemented and maintained Informatica PowerCenter and IDQ 8.6.1 application for matching and merging process.
  • Developed complex mappings using Informatica PowerCenter Designer to transform and load the data from various source systems like Oracle and Sybase into the Oracle target database.
  • Built a reusable staging area in Oracle for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ or QualityStage.
  • Worked extensively on processing structured and unstructured data.
  • Maintained quality metrics, knowledge management, and defect prevention and analysis.
  • Used breakpoints and various test conditions with Debugger tool to test the logic and the validity of the data moving through the mappings.
  • Extensively used Slowly Changing Dimensions Type II in various data mappings to load dimension tables in the data warehouse.
  • Used Source Qualifier and Lookup SQL overrides, persistent caches, incremental aggregation, partitions and parameter files for better performance.
  • Used session logs, workflow logs and the Debugger to debug sessions and analyse problems associated with the mappings and generic scripts.
  • Published different packages from Framework Manager to Cognos Connection.
  • Set security for individual reports in Cognos Connection.
  • Tested all applications, loaded the data into the target Oracle warehouse tables, and scheduled, ran and monitored extraction and load sessions and batches using Informatica Workflow Manager.
  • Created Migration Documentation and Process Flow for mappings and sessions.
  • Migrated mappings from Development to Testing and performed Unit Testing and Integration Testing.
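The matching-and-merging process mentioned above can be sketched as follows. This is a simplified Python illustration only, with a hypothetical similarity threshold and field names; IDQ's own matching transformations use much richer comparison algorithms:

```python
import difflib

# Sketch of a match-and-merge pass: group records whose names are
# near-duplicates, then merge each group into one "golden" record.
# Threshold and field names are hypothetical illustrations.

def is_match(name_a, name_b, threshold=0.85):
    """Treat two names as duplicates if their similarity ratio clears the threshold."""
    ratio = difflib.SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
    return ratio >= threshold

def merge_records(records):
    """Cluster duplicate records, then keep the first non-empty value per field."""
    groups = []
    for rec in records:
        for group in groups:
            if is_match(group[0]["name"], rec["name"]):
                group.append(rec)
                break
        else:
            groups.append([rec])
    merged = []
    for group in groups:
        golden = {}
        for rec in group:
            for field, value in rec.items():
                if field not in golden or not golden[field]:
                    golden[field] = value
        merged.append(golden)
    return merged
```

The survivorship rule here (first non-empty value wins) is the simplest possible choice; production merge rules typically rank sources by trust or recency.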

Environment: Informatica PowerCenter 9.5, Informatica IDQ 8.6.1, Oracle 10g/9i, TOAD, Korn shell, Cognos, Control-M, Erwin.
