
Sr. Informatica Developer Resume

GA

SUMMARY

  • 12+ years of experience designing, developing, and maintaining large business applications spanning data migration, integration, conversion, data warehousing, and testing, including 2+ years of experience with Hadoop and Big Data ecosystem technologies.
  • Thorough domain knowledge of business financial systems, banking, animal healthcare information technology, insurance and reinsurance, pharmacy claims systems, and the telecom industry.
  • Professional understanding of Software Development Life Cycle (SDLC).
  • Expertise in data warehousing, ETL architecture, and data profiling using Informatica PowerCenter 9.6/9.5/9.1/8.6/8.5/8.1/7.1 client and server tools.
  • Used Informatica Metadata Manager and Metadata Exchange extensively to maintain and document metadata.
  • Thorough knowledge of relational and dimensional models (Star and Snowflake), fact and dimension tables, and Slowly Changing Dimensions (SCD).
  • Hands-on experience implementing Slowly Changing Dimension Type 1, 2, and 3 methodologies (illustrated in the sketch following this summary).
  • Experience integrating various data sources such as SQL Server, Oracle, Teradata, Vertica, Netezza, flat files, NoSQL, and DB2 on mainframes into the staging area.
  • Business requirements review, assessment, and gap identification; defining business processes; delivering project roadmaps including documentation, initial source data definitions, mappings, detailed ETL development specifications, and operations documentation.
  • Expert in installing and configuring Informatica server with SQL Server and Oracle; able to handle Informatica administrator tasks such as configuring DSNs, creating connection strings, copying and moving mappings and workflows, and creating folders.
  • Experience in complex PL/SQL packages, functions, cursors, triggers, views, materialized views, T-SQL, DTS, and SSIS.
  • Expert in troubleshooting, debugging, and improving performance at different stages: database, workflow, and mapping. Experienced in writing UNIX shell scripts and Perl scripts.
  • Hands-on experience in application development using RDBMS and Linux shell scripting.
  • Excellent understanding/knowledge of Hadoop architecture and its components, such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node, Resource Manager, Node Manager, and the MapReduce programming paradigm.
  • Experience importing and exporting data between HDFS and relational database systems using Sqoop. Proven experience leading teams.
  • Enthusiastic and goal-oriented team player possessing excellent communication, interpersonal skills and leadership capabilities with high level of adaptability.
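
As a small illustration of the Slowly Changing Dimension work noted above, the sketch below shows one common Type 2 pattern in Oracle-style SQL. It is a minimal sketch using assumed, hypothetical names (DIM_CUSTOMER, STG_CUSTOMER, and their columns), not taken from any actual project; in PowerCenter the same logic is typically built with Lookup and Update Strategy transformations rather than hand-written SQL.

```sql
-- Hypothetical SCD Type 2 load: expire changed rows, then insert new versions.
-- Table, column, and sequence names are illustrative placeholders.

-- Step 1: close out current dimension rows whose tracked attributes changed.
UPDATE dim_customer d
   SET d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer
       (customer_key, customer_id, address, status,
        eff_start_dt, eff_end_dt, current_flg)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flg = 'Y');
```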

TECHNICAL SKILLS

Data warehousing: Informatica PowerCenter 9.5/9.1/8.6/8.5/8.1/7.1, Metadata Manager, Informatica PowerConnect for Siebel/SAP/PeopleSoft, PowerExchange for DB2, Metadata Reporter, Data Profiling, Data Cleansing, OLAP, OLTP, Star & Snowflake Schema, Fact & Dimension Tables, Physical & Logical Data Modeling, DataStage 7.x, Erwin 4.0, ER Studio, Dial, SSIS

BI Tools: Business Objects XI R2/6.0/5.1/5.0, Cognos, QlikView, Yotta

Databases: SQL Server, Oracle 10g/9i/8i/8/7.3, Sybase, Teradata 6, MySQL, MS Access, DB2 8.0/7.0, SeaQuest, Vertica, Netezza, Hive (NoSQL)

Languages: XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting

Operating System: HP-UX 11/10/9, IBM AIX 4.0/3.1, Sun Solaris 9/8/7/2.6/2.5, SCO UNIX, Linux, Windows XP Professional/2000/NT/98/95

Other Tools: Autosys, Control-M, Remedy, Mercury Quality Center, StarTeam, Lotus Notes, Tidal

DB Programming & Tools: RDBMS, Joins, Indexes, Views, Functions, Triggers, Clusters, Procedures, SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, Explain plan, SQL Trace, DB Visualizer

PROFESSIONAL EXPERIENCE

Confidential, GA

Sr. Informatica Developer

Responsibilities:

  • Worked with the business and data analysts in requirements gathering and to translate business requirements into technical specifications.
  • Created a number of complex mappings, mapplets, reusable transformations, workflows, worklets, and sessions using Informatica PowerCenter 9.0.1 to implement the business logic and load the data incrementally.
  • Extracted data from flat files, CSV files, and Oracle databases into the staging area and populated the data warehouse.
  • Used Workflow Manager for session management, database connection management, and scheduling of jobs to be run in the batch process. Used the Debugger to test the mappings and fix bugs.
  • Worked on performance tuning at the source, target, mapping, session, and system levels.
  • Migrated ETL objects from DEV to UAT to PRD environments using Repository Manager.
  • Worked on UNIX scripts for running the workflows and performing threshold checks on incoming files.
  • Prepared ETL Design-Document and Implementation Plan documents.
  • Involved in Unit Testing and created various test cases and reviewed and fixed issues in the ETL code.
  • Supported UAT and resolved issues. Also worked on change requests (CRs) raised by the business.
  • Prepared Support Turnover documents in Production Support, for some interfaces.

Environment: Informatica PowerCenter 9.0.1, Oracle 11g, SQL Server, Sun Solaris, UNIX shell scripts, Tidal, and Remedy.

Confidential, GA

Sr. ETL Developer / Data Modeler

Responsibilities:

  • Created prototype reporting models, specifications, diagrams and charts to provide direction to system programmers.
  • Involved in Architecture and design of Data extraction, transformation, cleaning and loading.
  • Involved in Requirement gathering and source data analysis for the Data warehousing projects.
  • Involved in the Logical and Physical design and creation of the ODS and data marts.
  • Worked with the Analysis & Marketing team to support business decisions.
  • Involved in all phases of the SDLC: requirements, design, development, testing, pilot, training, rollout to field users, and production support.
  • Prepared Technical documents for all the modules developed by our team.
  • Interacted extensively with end users on requirements gathering, analysis, and documentation.
  • Documented methodology, data reports and model results and communicated with the Project Team / Manager to share the knowledge
  • Analyzed business requirements, transformed data, and mapped source data using MS SQL.
  • Imported data from relational databases into SAS files per detailed specifications.
  • Carried out data extraction and manipulation using PROC SQL, PROC SORT, and PROC REPORT to create preferred-customer lists per business requirements.
  • Developed T-SQL queries, triggers, functions, cursors, and stored procedures.
  • Created new database objects such as tables, procedures, functions, indexes, and views using T-SQL in the development and production environments for SQL Server 2008 R2 (see the sketch after this list).
  • Tuned slow-running queries using SQL Server Profiler and STATISTICS IO, evaluating joins and indexes, updating statistics, and modifying code.
  • Used DDL and DML for writing triggers, stored procedures, and data manipulation.
  • Created views to restrict access to data in a table for security.
  • Generated Reports using Global Variables, Expressions and Functions for the reports.
  • Created datasets and stored procedures in T-SQL for SAS reports.
  • Experience in performance tuning and optimization of queries and stored procedures.
  • Performed daily tasks including backup and restore by using SQL Server 2005 tools like SQL Server Management Studio, SQL Server Profiler, SQL Server Agent, and Database Engine Tuning Advisor.
  • Responsible for monitoring and making recommendations for performance improvement in hosted databases. This involved index creation, index removal, index modification, file group modifications, and adding scheduled jobs to re-index and update statistics in databases.
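
The sketch below gives a minimal, hypothetical example of the kind of T-SQL objects described in this role: a view that restricts access, a parameterized stored procedure used as a report data set, and a supporting index. The object and column names (dbo.Orders, dbo.vw_OpenOrders, etc.) are assumptions for illustration only.

```sql
-- Hypothetical T-SQL objects (SQL Server 2008 R2); all names are placeholders.

-- A view that restricts access to a subset of rows/columns for reporting users.
CREATE VIEW dbo.vw_OpenOrders
AS
SELECT OrderID, CustomerID, OrderDate, TotalAmount
FROM   dbo.Orders
WHERE  Status = 'OPEN';
GO

-- A simple parameterized stored procedure used as a report data set.
CREATE PROCEDURE dbo.usp_GetCustomerOrders
    @CustomerID INT,
    @FromDate   DATE
AS
BEGIN
    SET NOCOUNT ON;

    SELECT OrderID, OrderDate, TotalAmount, Status
    FROM   dbo.Orders
    WHERE  CustomerID = @CustomerID
      AND  OrderDate >= @FromDate
    ORDER BY OrderDate;
END;
GO

-- A supporting nonclustered index for the procedure's search predicate.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_OrderDate
    ON dbo.Orders (CustomerID, OrderDate)
    INCLUDE (TotalAmount, Status);
```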

Environment: SQL Server 2008 R2, T-SQL, SSIS, SSRS, Windows Server 2008 R2, SAS BI Dashboard, Visio

Confidential, GA

Sr. ETL Developer / Lead

Responsibilities:

  • Experience in Data Warehouse/Data Mart design, System analysis, Database design, ETL design and development, SQL, PL/SQL programming.
  • Created mapping between the databases and identified the possibilities of incorporating the new business rules with the existing design.
  • Created prototype reporting models, specifications, diagrams and charts to provide direction to system programmers.
  • Prepared high-level design (HLD) and low-level design (LLD) documents, project functional specifications, and business requirements documentation (BRD).
  • Participated in Design team brainstorm and user requirement gathering meetings.
  • Played the lead role in Designing and implementing Pricing Analytics, Best Customer and Finance Master Data Alignment Services Data marts.
  • Experience in managing the delivery of data extracts from the data sources to the warehouse and downstream applications.
  • Experience in requirement gathering and database design and implementation of star-schema, snowflake schema/dimensional data warehouse using Erwin.
  • Played the role of a Data warehouse architect in the Business Intelligence Enterprise Architect group for providing data architectural strategies.
  • Experience in Architecture and design of Data extraction, transformation, cleaning and loading.
  • Involved in Requirement gathering and source data analysis for the Data warehousing projects.
  • Involved in the Logical and Physical design and creation of the ODS and data marts.
  • Converted the Business rules into Technical Specifications for ETL process.
  • Involved in all phases of the SDLC: requirements, design, development, testing, pilot, training, rollout to field users, and production support.
  • Implemented mapping techniques for Type 1, Type 2 and Type 3 slowly changing dimensions.
  • Developed ETLs for Data Extraction, Data Mapping and Data Conversion using Informatica Power Center.
  • Involved in the development of Informatica mappings and mapplets and tuned them for optimum performance, dependencies, and batch design.
  • Worked with Informatica PowerCenter 9.5/9.6 tools - Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, mapplets, worklets, and reusable transformations.
  • Wrote scripts for collection of statistics, reorganization of tables and indexes, and creation of indexes to enhance data-access performance (see the maintenance sketch after this list).
  • Involved in Testing and Test Plan Preparation, Process Improvement for the ETL developments.
  • Involved in Data Quality Assurance, created validation routines, unit test cases and test plan.
  • Finalized the design document through design walkthroughs and independent user review. Prepared technical specifications for all the utilities per the company's standards.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Informatica mappings.
  • Designed and developed a Hadoop system to analyze SIEM (Security Information and Event Management) data using MapReduce, VSQL, and Sqoop.
  • Migrated data from SQL Server and SeaQuest (HP internal database) to HBase using Sqoop.
  • Configured various big data workflows to run on top of Hadoop, comprising heterogeneous jobs such as VSQL, Sqoop, and MapReduce.
  • Ability to adapt to evolving technology, strong sense of responsibility and accomplishment.
  • Prepared Technical documents for all the modules developed by our team.
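
As a minimal sketch of the statistics-collection and index-maintenance scripts mentioned above, the commands below show typical maintenance steps in T-SQL on the SQL Server side of this environment; dbo.FactSales and the index names are hypothetical placeholders.

```sql
-- Hypothetical maintenance commands behind the statistics/index scripts above.

-- Refresh optimizer statistics after a large load.
UPDATE STATISTICS dbo.FactSales WITH FULLSCAN;

-- Reorganize a lightly fragmented index, or rebuild all indexes with a new fill factor.
ALTER INDEX IX_FactSales_DateKey ON dbo.FactSales REORGANIZE;
ALTER INDEX ALL ON dbo.FactSales REBUILD WITH (FILLFACTOR = 90);

-- Add a covering index for a frequent reporting query.
CREATE NONCLUSTERED INDEX IX_FactSales_DateKey_Product
    ON dbo.FactSales (DateKey, ProductKey)
    INCLUDE (SalesAmount);
```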

Environment: Erwin, Informatica Power Center 9.5, Metadata Manager, SQL server 2008, SeaQuest Database (HP Internal DB), Dial (HP Internal ETL Tool), Yotta (HP Internal Reporting tool), Hadoop, Vertica, VSQL, Sqoop, MapReduce

Confidential, Manhattan, NY

Sr. Informatica Developer

Responsibilities:

  • Worked with the business and data analysts in requirements gathering and to translate business requirements into technical specifications.
  • Understood and discussed the ETL requirements with the business unit and prepared the detailed design documentation according to the standards.
  • Integrated all the jobs using complex mappings, including mapplets and workflows, built with Informatica PowerCenter 9.1 Designer and Workflow Manager.
  • Used the transformations like Expression, Lookup, Source Qualifier, Transaction Control, XML Generator, Web Service, XML Parser, Joiner, etc.
  • Extracted high-volume datasets from XML sources and Netezza relational tables and loaded XML targets.
  • Used Workflow manager for session management, database connection management and scheduling of jobs to be run in the batch process.
  • Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets to provide reusability across mappings.
  • Involved in performance tuning at the source, target, mapping, session, and system levels.
  • Performed performance tuning to increase throughput at both the mapping and session levels, along with SQL query optimization.
  • Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Performed data validation for various source-to-target mappings.
  • Migrated ETL objects from DEV to UAT to PRD environments using Repository Manager.
  • Provided support and quality validation through test cases for all stages of unit and integration testing.
  • Prepared ETL Design-Document and Implementation Plan documents.
  • Involved in Unit Testing and created various test cases and reviewed and fixed issues in the ETL code.
  • Supported UAT and resolved issues. Also worked on change requests (CRs) raised by the business.

Environment: Informatica PowerCenter 9.1, SQL Server, Oracle 9i, Netezza, Remedy.

Confidential, Raleigh, NC

Informatica Developer

Responsibilities:

  • Worked with the business and data analysts in requirements gathering and to translate business requirements into technical specifications.
  • Created a number of complex mappings, mapplets, reusable transformations, workflows, worklets, and sessions using Informatica PowerCenter 9/8.6.1 to implement the business logic and load the data incrementally.
  • Extracted data from flat files, CSV files, and Oracle databases into the staging area and populated the data warehouse.
  • Used Workflow Manager for session management, database connection management, and scheduling of jobs to be run in the batch process. Used the Debugger to test the mappings and fix bugs.
  • Worked on performance tuning at the source, target, mapping, session, and system levels.
  • Migrated ETL objects from DEV to UAT to PRD environments using Repository Manager.
  • Worked on UNIX scripts for running the workflows and performing threshold checks on incoming files.
  • Prepared ETL Design-Document and Implementation Plan documents.
  • Involved in Unit Testing and created various test cases and reviewed and fixed issues in the ETL code.
  • Supported UAT and resolved issues. Also worked on change requests (CRs) raised by the business.
  • Prepared Support Turnover documents in Production Support, for some interfaces.

Environment: Informatica PowerCenter 8.6/7.1, Oracle 9i, SQL Server, Sun Solaris, UNIX shell scripts, Erwin, Autosys, and Remedy.

Confidential, Jersey City, NJ

Sr. Informatica Developer

Responsibilities:

  • Involved in data analysis and development to understand attribute definitions for migration.
  • Prepared mapping specifications documents, unit testing documents for developed Informatica mappings.
  • Prepared mapping documents for loading the data from legacy system, flat files to staging area.
  • Prepared mapping documents for loading the data from staging area to IDS.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Worked with Informatica PowerCenter 8.6/8.5 tools - Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, mapplets, worklets, and reusable transformations.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformation.
  • Involved in the design and development of mappings from legacy system to target database.
  • Involved in performance tuning of the mappings by doing necessary modification to reduce the load time.
  • Used SQL tools such as TOAD to run SQL queries and validate the data (see the reconciliation sketch after this list).
  • Defined test cases and prepared test plan for testing ODS jobs.
  • Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
  • Worked closely with the business analysts' team to resolve problem tickets and service requests. Helped the 24/7 production support team.
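
The queries below are a minimal sketch of the kind of source-to-target validation run in a SQL tool such as TOAD, as referenced in this role; stg_policy and ods_policy are hypothetical table names used only for illustration.

```sql
-- Hypothetical row-count reconciliation between a staging table and its target.
SELECT 'STG_POLICY' AS table_name, COUNT(*) AS row_cnt FROM stg_policy
UNION ALL
SELECT 'ODS_POLICY', COUNT(*) FROM ods_policy;

-- Spot-check for keys that were extracted but never reached the target.
SELECT s.policy_id
  FROM stg_policy s
  LEFT JOIN ods_policy t ON t.policy_id = s.policy_id
 WHERE t.policy_id IS NULL;
```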

Environment: Informatica PowerCenter 8.6/8.5, Informatica Power Exchange, COGNOS 8.3, SQL server 2005, Windows Vista

Confidential, Brick, NJ

ETL Developer

Responsibilities:

  • Worked across the full project life cycle, from analysis to production implementation, with emphasis on identifying the source and validating source data, developing the required logic and transformations, creating mappings, and loading the data into different targets.
  • Extracted data from the ODS and loaded it into the EDW using Informatica.
  • Performed Repository object migration from Development to testing and testing to production Environments.
  • Worked with pre and post sessions, and extracted data from Transaction System into Staging Area.
  • Performed Unit Tests and tested Mapping logic by passing sample messages.
  • Responsible for Self and Peer Review of Informatica mappings under Development Phase.
  • Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.
  • Scheduled the tasks to be run using the Workflow Manager.
  • Implemented various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedure, and Router.
  • Created and monitored sessions and various other tasks, such as Event-Raise, Event-Wait, Decision, and Command tasks, using Informatica Workflow Manager.
  • Worked on UNIX scripts for running the workflows and performing threshold checks on incoming files.

Environment: Informatica PowerCenter 7.1, Oracle 9i, PL/SQL, SQL Server 2000, flat files, SQL*Loader, UNIX shell scripts, MicroStrategy 8, Erwin.

Confidential, Hartford, CT

ETL Developer

Responsibilities:

  • Extensively worked on Business Analysis and Data Analysis.
  • Interacted with the users to convert business logic into ETL specifications.
  • Extensively analyzed and used Ralph Kimball approach for building Data warehouse.
  • Worked with the PowerCenter Designer tool in developing mappings and mapplets to extract and load the data from flat files and Oracle into Oracle.
  • Created transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator, and Sequence Generator for loading the data into targets.
  • Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.

Environment: Informatica Power Center 7.1, Oracle 8i, MS SQL Server 7.0/2000, MS Excel 97, Flat files, PL/SQL, SQL, Windows 2000, UNIX

Confidential

Programmer Analyst

Responsibilities:

  • Participated in discussions with clients to better understand business requirements
  • Created forms for new policy entry details.
  • Responsible for maintaining policies per customer requirements.
  • Wrote PL/SQL stored procedures for data extraction (see the sketch after this list).
  • Created reports that allow users to retrieve complete financial-status information as required, such as monthly reports and day-to-day transactions.
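
The procedure below is a minimal sketch of a PL/SQL extraction routine of the kind referenced above; the table and column names (policy_txn, policy_extract, etc.) are hypothetical and stand in for the actual policy schema.

```sql
-- Hypothetical PL/SQL extraction procedure; all object names are placeholders.
CREATE OR REPLACE PROCEDURE extract_monthly_policies (
    p_month IN DATE
) AS
BEGIN
    -- Copy the month's policy transactions into a reporting extract table.
    FOR rec IN (SELECT policy_no, customer_name, premium_amt, txn_dt
                  FROM policy_txn
                 WHERE txn_dt >= TRUNC(p_month, 'MM')
                   AND txn_dt <  ADD_MONTHS(TRUNC(p_month, 'MM'), 1))
    LOOP
        INSERT INTO policy_extract (policy_no, customer_name, premium_amt, txn_dt)
        VALUES (rec.policy_no, rec.customer_name, rec.premium_amt, rec.txn_dt);
    END LOOP;
    COMMIT;
END extract_monthly_policies;
/
```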
