
ETL/Informatica Developer Resume


California

SUMMARY:

  • Around 6 years of IT experience in Data warehousing with emphasis on Business Requirements Analysis, Application Design, Development, coding, testing, implementation and maintenance of client/server Data Warehouse and Data Mart systems.
  • Worked in Data Warehouse and Business Intelligence Projects along with the team of Informatica (ETL).
  • Experience in Design and Development of ETL methodology for supporting Data Migration, data transformations & processing in a corporate wide ETL Solution using Teradata TD 14.0/13.0/12.0.
  • Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for data warehouses.
  • Involved in all phases of the Systems Development Life Cycle (SDLC), from analysis and planning through development and deployment, using both Waterfall and Agile approaches.
  • Implemented and followed a Scrum Agile development methodology within the cross functional team and acted as a liaison between the business user group and the technical team.
  • Experience in scheduling Sequence and parallel jobs using Data Stage Director, UNIX scripts and scheduling tools.
  • Practical understanding of Data modeling (Dimensional & Relational) concepts such as Star-Schema Modeling, Snowflake Schema Modeling, and Fact and Dimension tables.
  • Experience in UNIX shell scripting for processing large volumes of data from varied sources and loading into databases like Teradata and Vertica.
  • Data and database migration, including mainframe-to-PC database conversions, and data mapping, retrieval, cleansing, consolidation, and reporting for client review.
  • Experience in OLTP Modeling (2NF, 3NF) and OLAP Dimensional modeling (Star and Snow Flake) using Erwin (conceptual, logical and physical data models).
  • Significant experience with the ETL tool Informatica PowerCenter (10.1/9.6.1/9.5.1/9.x/8.x) in analyzing, designing and developing ETL processes for Data Warehousing projects.
  • Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformations, Informatica Repository Manager and Informatica Server Manager.
  • Designed and developed interfaces to load data from multiple sources, such as relational databases and flat files, into an Oracle database.
  • Hands on experience in debugging and performance tuning of sources, targets, mappings and sessions.
  • Experience in integrating various data source definitions such as SQL Server, Oracle, Teradata, MySQL, Flat Files, XML and XSDs.
  • Experience with Teradata tools and utilities (BTEQ, FastLoad, MultiLoad, FastExport, and TPump).
  • Experience in SQL, PL/SQL, UNIX shell scripting
  • Experience using Joiner, Sorter, Aggregator, Java, Update Strategy, Filter and Router transformations.
  • Profound knowledge of the Teradata database architecture and experience with Teradata unload utilities such as FastExport.
  • Strong skills in SQL, PL/SQL packages, functions, stored procedures, triggers and materialized views to implement business logic in oracle database.
  • Experience with relational databases such as Oracle 8i/9i/10g/11g and SQL Server 2005/2008.
  • Worked with various SQL Editors such as TOAD, SQL Plus, and Query Analyzer.
  • Experienced with scripting languages such as UNIX Shell (KSH - Korn Shell) and Perl.
  • Experience in identifying and resolving ETL production root-cause issues.
  • Experience in maintenance, enhancements, performance tuning of ETL code.
  • Involved in Unit testing and System testing to verify that data loads into the targets are accurate.
  • Good working experience in writing SQL and PL/SQL scripts including views and materialized views.
  • Experience working on Informatica Scheduler for job scheduling.
  • Strong analytical and problem-solving skills.
  • An excellent team member with the ability to work effectively both individually and as part of a team, with good interpersonal relations, strong communication skills, and a high level of motivation.
  • Desire to learn new skills, technologies, and adapt to new information demands.
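The Teradata loading work summarized above (FastLoad/MultiLoad utilities driven from UNIX shell scripts) is commonly structured as a wrapper that generates a control script and hands it to the utility. The following is a minimal sketch of that pattern; the database, table, file, and credential names are all hypothetical placeholders, not taken from the resume:

```shell
#!/bin/sh
# Sketch of a shell wrapper that loads a pipe-delimited flat file into a
# Teradata staging table with FastLoad. All object names are placeholders.

DATA_FILE=customer_feed.txt
FLOAD_SCRIPT=load_customer.fload

# Generate the FastLoad control script via a heredoc.
cat > "$FLOAD_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,password;       /* placeholder credentials */
.SET RECORD VARTEXT "|";               /* pipe-delimited input */
DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(50))
FILE = customer_feed.txt;
BEGIN LOADING stg.customer
      ERRORFILES stg.customer_err1, stg.customer_err2;
INSERT INTO stg.customer VALUES (:cust_id, :cust_name);
END LOADING;
.LOGOFF;
EOF

# In a real run the generated script is fed to the utility:
#   fastload < "$FLOAD_SCRIPT"
echo "generated $FLOAD_SCRIPT"
```

In practice the wrapper would also check the FastLoad return code and the error tables before signaling the scheduler that the load succeeded.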

TECHNICAL SKILLS:

Operating Systems: Windows, Linux, Unix

Languages: SQL, PL/SQL, Unix Shell Script

Database: Teradata, Oracle, SQL Server

Special Software/Tools: Putty, SQL Assistant, Viewpoint, TSAM, FastLoad, MultiLoad, FastExport, TPump, Control-M, Crontab, Jobtrac, CA-7, Jira

ETL Tools: Informatica PowerCenter 10.1/9.6.1/9.1/8.x

PROFESSIONAL EXPERIENCE:

Confidential, California

ETL/Informatica Developer

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Efficiently worked in all phases of the System Development Life Cycle (SDLC) using different methodologies such as Agile and Waterfall.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Performed the requirement gathering, analysis, design, development, testing, implementation, support and maintenance phases of Data Integration projects.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for the Data Warehouse, supporting data from multiple sources in real time.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Designed and built Informatica solutions, using PDO (Pushdown Optimization) where required.
  • Extensively worked with Informatica Power Center.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Analyzed source data files and gathered requirements from the business users.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Worked on different tasks in Workflows such as Sessions, Event-Raise, Event-Wait, Decision, E-mail, Command, Worklets, Assignment and Timer, as well as scheduling of the workflows.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell scripts and pmcmd commands for FTP of files from a remote server and for backup of the repository and folders.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Prepared DML scripts for maintenance tables, then reviewed, tested, and executed them.
  • Used GitHub as the version control tool for versioning and promoting code to higher environments such as SIT, UAT, Pre-production and Production.
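The shell/pmcmd work described in this project typically takes the form of a small wrapper around Informatica's `pmcmd` command-line client. Below is a hedged sketch of such a wrapper; the domain, integration service, folder, and workflow names are invented for illustration, and the command is echoed rather than executed so the sketch runs without an Informatica installation:

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper for starting a workflow. The domain, service,
# folder, and workflow names below are placeholders, not real objects.

PMCMD=${PMCMD:-pmcmd}        # path to the Informatica pmcmd binary
DOMAIN=Domain_ETL
INT_SVC=IS_PROD
FOLDER=DW_LOADS
WORKFLOW=wf_load_customer_dim

# A real invocation would also pass credentials (-u/-p or -uv/-pv).
CMD="$PMCMD startworkflow -sv $INT_SVC -d $DOMAIN -f $FOLDER -wait $WORKFLOW"

# Echo instead of executing so the sketch is runnable anywhere.
echo "$CMD"
```

The `-wait` flag makes the call block until the workflow finishes, which lets the scheduler key off the script's exit status.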

Environment: Informatica PowerCenter 10.1, SQL Assistant, Unix, Oracle 11g, Control-M

Confidential, Princeton, NJ

ETL Informatica Developer

Responsibilities:

  • Worked closely with business analyst and Data Warehouse architect to understand the source data and the need of the Warehouse.
  • Involved in Unit and Integration testing of Informatica Sessions, Batches and the Target Data.
  • Involved in migration of mappings and sessions from development repository to test and production repository.
  • Ran the daily jobs for raw and clean data files and loaded them into the ABC tables.
  • Created the Autosys jobs and scheduled those as per the requirement.
  • Implemented the code in WhereScape RED and sent the data to the extract files.
  • Extensively involved in tuning the mappings, sessions and the Source Qualifier query.
  • Designed and created ETL mappings using Informatica mapping designer.
  • Worked extensively on performance tuning by making changes to SQL in source qualifier.
  • Creation of facts and dimensions according to the business requirements.
  • Experienced in identifying and documenting data integration issues, challenges such as duplicate data, non-conformed data, and unclean data.
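The Source Qualifier tuning mentioned in this project usually means replacing a generic `SELECT *` with an override query that pushes filtering and joining down into the database. The sketch below writes such an override to a file so it stays runnable anywhere; the table and column names are hypothetical:

```shell
#!/bin/sh
# Illustrative Source Qualifier SQL override in the tuning style described
# above: select only the needed columns and push the filter and join into
# the database instead of doing that work inside the mapping.
# Table and column names are placeholders.

cat > sq_override.sql <<'EOF'
-- Untuned pattern: SELECT * from the source, filter later in the mapping.
-- Tuned override: return only required columns and rows.
SELECT o.order_id,
       o.order_dt,
       c.cust_name
FROM   orders o
JOIN   customers c ON c.cust_id = o.cust_id
WHERE  o.order_dt >= DATE '2017-01-01';   -- incremental load window
EOF
echo "wrote sq_override.sql"
```

Reducing the rows and columns returned by the Source Qualifier is often the single largest session-level performance win, since less data crosses the network into the Integration Service.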

Environment: Informatica PowerCenter 10.0.1, WhereScape RED 2.1.0, Autosys, Oracle 12c, MS Access, MS SQL Server 2008, Unix, Shell Scripting, Greenplum, DB2, SQL, PL/SQL, XML, SQL*Plus

Confidential, Princeton NJ

Informatica Developer

Responsibilities:

  • Involved in understanding the Requirements of the End Users/Business Analysts and Developed Strategies for ETL processes.
  • Developed various graphs to process Contract, Group, Member and Pharmacy Claims data based on business requirements utilizing functionalities like Rollup, Lookup, Scan, etc.
  • Extracted data from different sources such as MVS data sets, flat files ("pipe"-delimited or fixed length), Excel spreadsheets and databases.
  • Managed all development and support efforts for the Data Integration/Data Warehouse team.
  • Used Informatica PowerCenter 9.0.1 to Extract, Transform and Load data into the Netezza Data Warehouse from various sources such as Oracle and flat files.
  • Understood the structure of the data, built the data architecture, implemented the data model in Vertica, and carried out data mapping from the legacy Oracle system to Vertica.
  • Saved resources from a claims tracking process by modeling a Claims data mart to contain aggregated claims & Rolled up claims by coding SQL stored procedures, Informatica (ETL) and UNIX scripts.
  • Created and Configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
  • Prioritized requirements to be developed according to Agile methodology.
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands.
  • Provided on-call support during releases of the product from lower-level to production environments.
  • Worked with the Crontab scheduling tool for job scheduling.
  • Involved in Unit testing, User Acceptance testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.
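The crontab scheduling used in this project boils down to entries like the one sketched below. The script path and schedule are invented for illustration; the line is echoed rather than installed so the sketch runs anywhere:

```shell
#!/bin/sh
# Sketch of a crontab entry for a nightly ETL job. The script path is a
# placeholder. Field order: minute hour day-of-month month day-of-week command.

CRON_LINE='30 2 * * 1-5 /opt/etl/bin/run_claims_load.sh >> /opt/etl/logs/claims_load.log 2>&1'

# Installing it would look like:
#   (crontab -l; echo "$CRON_LINE") | crontab -
echo "$CRON_LINE"
```

Here `30 2 * * 1-5` fires at 02:30 on weekdays, and redirecting both stdout and stderr to a log file preserves the job output for production support.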

Environment: Informatica PowerCenter 9.5.1, SQL Assistant, Oracle 10g, Unix, TPT, SQL Server, WLM, ClearCase, FTP.

Confidential

Informatica Developer

Responsibilities:

  • Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets and Transformations.
  • Involved in the development of Informatica mappings and performance tuning.
  • Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter and Sequence Generator.
  • Designed the procedures for getting the data from all systems to Data Warehousing system.
  • Extensively used ETL to load data from different databases and flat files to Oracle.
  • Performed gap analysis for various EDI transactions such as 810, 850 and 856.
  • Extensively worked on database triggers, stored procedures, functions and database constraints. Wrote complex stored procedures and triggers and optimized them to maximize performance.
  • Created and ran Sessions & Batches using Server Manager to load the data into the Target Database.
  • Used Export & Import utilities and SQL*Loader to refresh data from the production to the development environment.
  • Involved in Forward and reverse engineering, following the Corporate Standards in Naming Conventions, using Conformed dimensions whenever possible.
  • Troubleshooting while testing is in progress in the development environment, which includes monitoring of Alert Log, Trace file and fixing software bugs.
  • Collected Multi-Column Statistics on all the non-indexed columns used during the join operations & all columns used in the residual conditions.
  • Extensively used Derived Tables, Volatile Tables and Global Temporary (GTT) tables in many of the ETL scripts.
  • Tuned Teradata SQL statements using EXPLAIN: analyzed data distribution among AMPs and index usage, collected statistics, defined indexes, revised correlated subqueries, used hash functions, etc.
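The statistics-collection and EXPLAIN work in this project is normally scripted as a BTEQ input file driven from the shell. The sketch below generates such a file; the database, table, and column names are hypothetical, and the file is only written, not executed, so the sketch stays runnable without Teradata:

```shell
#!/bin/sh
# Sketch of the statistics-collection step described above, expressed as a
# BTEQ input file. Database, table, column, and credential names are
# placeholders.

cat > collect_stats.bteq <<'EOF'
.LOGON tdprod/etl_user,password;   /* placeholder credentials */

/* Multi-column statistics on non-indexed columns used in joins */
COLLECT STATISTICS ON dw.claims COLUMN (claim_id, member_id);

/* EXPLAIN shows the plan: row estimates, join strategy, AMP redistribution */
EXPLAIN
SELECT c.claim_id
FROM   dw.claims c
JOIN   dw.members m ON m.member_id = c.member_id
WHERE  c.service_dt >= DATE '2016-01-01';

.LOGOFF;
EOF

# A real run would pipe the file to the utility:
#   bteq < collect_stats.bteq
echo "generated collect_stats.bteq"
```

Comparing the EXPLAIN output before and after collecting statistics is the usual way to confirm the optimizer picked up the better estimates.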

Environment: Informatica 8.x, Erwin, Oracle 9i, DB2, SQL*Loader, SQL Developer, Flat files, UNIX
