Sr. Informatica Developer Resume

Bethpage, NY

SUMMARY

  • 8+ years of IT experience in analysis, design and development of various software applications in a client-server environment, providing Business Intelligence solutions in Data Warehousing for Decision Support Systems and OLAP application development
  • 5+ years of experience as an ETL Analyst and ETL Developer in Data Warehouses / Data Marts using Informatica PowerCenter
  • 5+ years of experience in Oracle, SQL, PL/SQL and UNIX shell scripting
  • Worked in multiple client-specific environments in the Financial, Telecommunications, Banking and Insurance domains
  • Extensively used ETL methodologies supporting the data Extraction, Transformation and Loading (ETL) process in a corporate-wide ETL solution using Informatica PowerCenter
  • Experience in using Data sources/targets such as Oracle 11g/10g/9i/8i, SQL Server 2008/2005, Teradata, Netezza, DB2, XML and Flat files
  • Worked extensively on various Informatica Data Integration components - Repository Manager, Designer and Workflow Manager/Monitor
  • Vast experience in Designing and developing complex mappings from varied transformation logic like Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy etc
  • Good understanding of Data Warehouse concepts and principles, the Kimball and Inmon approaches, Star and Snowflake Schemas, Fact/Dimension tables, and Normalization/Denormalization
  • Expertise in Data Analysis, Data Mapping, Data Modeling, Data Profiling and development of Databases for business applications and Data warehouse environments
  • Extensively involved in creating Oracle PL/SQL Stored Procedures, Functions, Packages, Triggers, Cursors, and Indexes with Query optimizations as part of ETL Development process
  • Proficiency in Data Warehousing techniques for Data Cleansing, Slowly Changing Dimensions (SCD), Surrogate Key assignment and Change Data Capture (CDC)
  • Strong skills in data analysis, data requirement analysis and data mapping for ETL processes
  • Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions
  • Have good Data Analysis and Data Validation skills
  • Designed Source-to-Target mappings and handled code migration, version control, scheduling tools, auditing, shared folders, data movement and naming in accordance with ETL best practices, standards and procedures
  • Worked with Business Managers, Analysts, and end users to correlate Business Logic and Specifications for ETL Development
  • Exposure to planning and executing all phases of the System Development Life Cycle (SDLC), including System Analysis, Technical Specifications, Design, Development, Maintenance, Unit Testing, Integration Testing, Regression Testing, UAT, Implementation, Workflow Design, Documentation and Production/Application Support
  • Good communication, decision-making and organizational skills, along with outstanding analytical and problem-solving skills for undertaking challenging assignments
  • Able to work well independently as well as in a team, helping to troubleshoot technology and business-related problems

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 7.1/8.x/9.1/9.5, PowerExchange 8.x

Databases: Oracle 8i/9i/10g/11g, SQL Server 2005/2008, Netezza, DB2 7.0/8.0

Programming: SQL, PL/SQL, C, C++, Java, JavaScript, Perl, Unix Shell Scripting

Data Modeling: ERwin 7.0/7.2, Visio

Environment: UNIX, Linux, Windows 2000/XP/Vista/7

Generic Tools: SQL*Plus, SQL*Loader, TOAD, FTP Tools, Serena Version Manager, Microsoft Word / Excel / Visio / PowerPoint

PROFESSIONAL EXPERIENCE

Confidential

Sr. Informatica developer

Responsibilities:

  • Involved in various design discussions and requirement discussions with the end users.
  • Involved in identifying all the Data Models for all the applications.
  • Involved in creating reusable control table driven ETL Architecture.
  • Created ETL Process control architecture and integrated the same architecture with reporting applications.
  • Worked closely with all the application owners to understand the functionality of their applications at a detailed level and the data models they use.
  • Studied various Planning applications to understand the source data for FDW.
  • Created ETL Mapping specifications using functional specifications.
  • Created Technical Specifications document based on functional specifications.
  • Developed complex ETL mappings that involve parallel processing of multiple instances based on certain parameters in the control table.
  • Created complex Mappings using different Transformations like Filter, Router, Joiner, Connected & Unconnected Lookups, Sorter, Aggregator and Sequence Generator to pipeline data to Data Warehouse
  • Developed workflows with multiple sessions and instances for same subject area.
  • Created pre-session and post-session tasks for all the sessions to update the ETL process table, which is used to track the current status of the ETLs.
  • Developed Dimension mappings that load the dimension from landing zone table involving Type 2 and Type 1 transformations.
  • Involved in tuning Informatica ETL mappings analyzing them thoroughly.
  • Involved in identifying various bottlenecks at different levels (database, mapping, session, and workflow) and came up with solutions to improve performance.
  • Created table partitions for each country, operating company and period combination, enabling faster retrieval of data.
  • Created a Stored Procedure to perform a partition swap from staging to FDW, called in the ETL to load the Fact data for the various tables passed as parameters to the stored procedure (a sketch of this approach follows this list).
  • Created Oracle Stored Procedures to implement the ETL process control logic.
  • Created Oracle Stored Procedures for Segment Dimension which holds all the dimension keys.
  • Designed Oracle views that generate extracts for upstream applications and created ETLs to read these views and write the output to files.
  • Created a model for the Audit mechanism and included Audit counts in each of the ETLs to verify source and target counts and sums.
  • Identified various extracts needed for upstream reporting applications and planning applications and designed structures for the same.
  • Documented all the ETL and Oracle Procedures developed.
  • Created Unit Test plans for various ETLs developed.
  • Performed Unit Testing and Integration Testing for the ETLs.
  • Reviewed the code, design and test plans as appropriate throughout the project lifecycle
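
A minimal sketch of the partition-swap procedure referenced above, assuming Oracle partition exchange; the procedure, table and parameter names here are hypothetical, not the actual FDW objects:

    -- Hypothetical partition-swap procedure; names are illustrative only.
    CREATE OR REPLACE PROCEDURE swap_fact_partition (
        p_target_table   IN VARCHAR2,   -- fact table in FDW, passed in by the ETL
        p_partition_name IN VARCHAR2,   -- partition for a country/op-co/period combination
        p_staging_table  IN VARCHAR2    -- fully loaded staging table
    ) AS
    BEGIN
        -- Swap the staging segment with the target partition in a single
        -- dictionary operation, so no data is physically copied.
        EXECUTE IMMEDIATE
            'ALTER TABLE ' || p_target_table ||
            ' EXCHANGE PARTITION ' || p_partition_name ||
            ' WITH TABLE ' || p_staging_table ||
            ' INCLUDING INDEXES WITHOUT VALIDATION';
    END swap_fact_partition;
    /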

Environment: Informatica Power Center 9.5, Oracle 11g, Toad for Oracle, SQL, PL/SQL, Windows 7

Confidential, Bethpage NY

Senior Informatica Developer

Responsibilities:

  • Prepared design specification documents as per the inputs received from the Architect and the Business Analyst.
  • Extracted data from heterogeneous source systems such as Oracle, SQL Server and Flat files (fixed-width and delimited).
  • Involved in Cleansing and Extraction of data and defined quality process for the warehouse
  • Developed Informatica ETL mappings, sessions and workflows based on the technical specification document.
  • Created Mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator
  • Designed and developed the logic for handling Slowly Changing Dimension table loads by flagging records with the Update Strategy transformation to populate the desired target rows
  • Developed reusable mapplets and transformations for reusable business calculations
  • Used exception handling logic in all mappings to handle null values and rejected rows
  • Tuned the ETL components to improve performance and ensure business continuity.
  • Worked with Persistent Caches for Conformed Dimensions for the better performance and faster data load to the data warehouse
  • Involved in performance tuning and optimization of Informatica Mappings and Sessions using partitions and data/index cache to manage very large volume of data
  • Performed query overrides in Lookup Transformation as and when required to improve the performance of the Mappings
  • Developed Oracle PL/SQL components for row level processing.
  • Dropped & recreated Indexes before & after loading through pre-SQL& post-SQL
  • Developed Unix scripts for processing Flat files.
  • Scheduled the jobs in Appworx.
  • Prepared Test Data and loaded it for Testing, Error handling and Analysis
  • Prepared the test cases and tested the ETL components for end to end process.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, validations and expected results based on design specifications for Unit Testing and Systems Testing
  • Created an Issue Log to identify the errors and used it for preventing any such errors in future development works.
  • Worked on the production code fixes and data fixes
  • Responsible for troubleshooting problems by monitoring all the Sessions that are scheduled, completed or running, and used the Debugger for complex problem troubleshooting.
  • Worked with Application support team in the deployment of the code to UAT and Production environments
  • Involved in production support, working on various mitigation tickets created while users worked to retrieve data from the database.
  • Worked as part of the production support team and provided 24x7 support
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Mappings
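
A hedged illustration of the pre-SQL / post-SQL index handling mentioned above; the index and table names are hypothetical:

    -- Pre-SQL on the target session (run before the bulk load):
    DROP INDEX idx_cust_dim_src_key;

    -- Post-SQL on the target session (rebuild after the load completes):
    CREATE INDEX idx_cust_dim_src_key
        ON cust_dim (customer_src_id, source_system_cd);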

Environment: Informatica Power Center 8.6, Oracle 11g, Toad for Oracle, MS SQL Server 2008, AIX server, Unix, Appworx, Winscp, Putty, Serena Version Manager

Confidential, Richmond VA

Informatica Developer

Responsibilities:

  • Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.
  • Developed Logical and Physical data models that capture current-state/future-state data elements and data flows using ERwin.
  • Involved in converting the Data Mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Involved in designing and customizing of Data Models for Data Mart supporting data from multiple sources on real time.
  • Designed the Data Mart defining Entities, Attributes and Relationships between them.
  • Extensively used the ERwin tool for Forward and Reverse Engineering, following the corporate standards in naming conventions and using Conformed Dimensions whenever possible.
  • Defined various Facts and Dimensions in the Data Mart, including Factless Facts, Aggregate and Summary Facts.
  • Reviewed Source Systems and proposed data acquisition strategy.
  • Designed and developed Informatica Mappings to load data from Source Systems to ODS and then to the Data Mart (the dimension-key lookups in these fact loads are sketched after this list).
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files (fixed width), and Excel files to staging database and from staging to the target Teradata Warehouse database
  • Extensively used Power Center to design multiple mappings with embedded business logic.
  • Created complex mappings using Transformations like Connected / Unconnected Lookup, Joiner, Router, Rank, Sorter, Aggregator and Source Qualifier Transformations
  • Created Mapplet and used them in different Mappings.
  • Implemented CDC (Change Data Capture) using Informatica PowerExchange
  • Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
  • Improved the performance of the ETL by indexing and caching
  • Designed and developed Oracle PL/SQL scripts for Data Import/Export.
  • Extracted and loaded data using Teradata utilities such as BTEQ, FastLoad, MultiLoad and FastExport
  • Worked with various Teradata utilities as external loaders in Informatica.
  • Developed Unix Shell Scripts for automating the execution of workflows.
  • Created various UNIX Shell Scripts for scheduling various Data Cleansing scripts and loading process.
  • Maintained the batch processes using Unix Shell Scripts.
  • Designed and deployed UNIX Shell Scripts.
  • Managed Change control implementation and coordinating daily, monthly releases and reruns.
  • Responsible for loading millions of records into the warehouse from different sources using SQL*Loader.
  • Involved in migration of Mappings and Sessions from development repository to production repository
  • Provided Production Support by executing the Sessions, diagnosing problems and fixing the Mappings for changes in business logic.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
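
The dimension-key resolution inside those fact-load mappings amounts to the set-based join below; in the project this logic lived in Informatica Lookup transformations, and the table and column names here are hypothetical:

    -- Illustrative only: resolve surrogate keys while loading a fact from the ODS.
    INSERT INTO sales_fact (date_key, product_key, store_key, sales_amt, qty_sold)
    SELECT d.date_key,
           p.product_key,
           s.store_key,
           o.sales_amt,
           o.qty_sold
      FROM ods_sales   o
      JOIN date_dim    d ON d.calendar_dt    = o.sale_dt
      JOIN product_dim p ON p.product_src_id = o.product_id
      JOIN store_dim   s ON s.store_src_id   = o.store_id;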

Environment: Informatica Power Center 8.6, PowerExchange, Oracle 10g, Teradata, Flat Files, ERwin, MS Visio, UNIX AIX, Unix Shell Scripting, SQL, PL/SQL, SQL Loader

Confidential, Concord, CA

Informatica Developer

Responsibilities:

  • Involved in understanding requirements, analyze new and current systems to quickly identify required Sources and Targets
  • Analyzed business requirements to build a Data Mart for various business processes conformed to the business rules
  • Analyzed the Functional Specs provided by the Data Architect
  • Designed Technical specification document based on the FSD
  • Created Mapping Design document based on FSD
  • Extensively used Informatica for extracting, transforming, loading databases from Oracle, flat files and DB2
  • Created PL/SQL Procedures and Functions for scrubbing the data (a hypothetical example follows this list)
  • Developed Transformation logic and designed various complex Mappings in the Designer
  • Designed and developed various mappings in Mapping Designer, sessions & workflows in Workflow Manager to extract data from flat files, Oracle and DB2 sources and load to Oracle.
  • Created, launched and scheduled sessions and configured email notifications
  • Implemented the best practices for the creation of Mappings, sessions, Workflows and performance optimization.
  • Implemented Pipeline partitioning to improve session performance.
  • Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data
  • Wrote Queries, Procedures and Functions that are used as part of different application modules
  • Performed Data Quality Analysis to determine cleansing requirements
  • Created scripts to update the parameter files to establish delta extraction
  • Wrote various shell scripts for pre-processing of data and scheduling of jobs
  • Wrote Unix Shell Scripts for Informatica pre-session and post-session commands to schedule the Informatica jobs (workflows)
  • Prepared Test Data and loaded it for Testing, Error handling and Analysis
  • Involved in preparing ETL design documents and Unit Test Plans for Mappings.
  • Prepared the code migration document and worked with release team in Migrating the code (Informatica Objects, Unix Scripts) from Development to Test and Production environments
  • Interacted with Users for analyzing various Reports.
  • Involved in addressing change requests raised as part of integration testing
  • Deployed ETL components code into multiple environments as per the approval received
  • Provided Production Support by executing the sessions, diagnosing problems and fixing the mappings for changes in business logic.
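
A hypothetical example of such a scrubbing routine, written for the Oracle 9i environment used on this project; the function name and cleansing rule are illustrative:

    CREATE OR REPLACE FUNCTION clean_phone (p_raw IN VARCHAR2)
        RETURN VARCHAR2
    IS
        v_digits VARCHAR2(30);
    BEGIN
        -- Keep digits only (9i-compatible TRANSLATE idiom: the inner call lists
        -- the non-digit characters, the outer call strips them).
        v_digits := TRANSLATE(p_raw,
                              '0' || TRANSLATE(p_raw, 'A0123456789', 'A'),
                              '0');
        RETURN CASE
                   WHEN LENGTH(v_digits) >= 10 THEN SUBSTR(v_digits, -10)
                   ELSE NULL   -- too short to be valid; rejected downstream
               END;
    END clean_phone;
    /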

Environment: Informatica PowerCenter 8.1, Windows XP, Oracle 9i, DB2, SQL, PL/SQL, TOAD, Delimited Flat Files, Unix Shell Scripting.

Confidential, Seattle, WA

Informatica Developer

Responsibilities:

  • Worked closely with BA Team to understand the Business Requirements.
  • Involved in Requirement Analysis and documenting Functional & Technical specifications.
  • Developed Data Mapping spreadsheet to present the transformation process.
  • Installed and configured Informatica 7.1
  • Developed complex Informatica mappings, mapplets and worklets to load data into the Staging area (Inbound and Outbound).
  • Created Workflows & Worklets using Session, Command and Email tasks and pre- and post- Session scripts as required
  • Used Informatica Workflow Manager to create, schedule, execute and Monitor the Sessions and Workflows.
  • Responsible for monitoring scheduled, running, completed and failed sessions.
  • Involved in debugging the failed mappings and developing error-handling methods.
  • Used Mapping Debugger to debug the Mapping and correct them
  • Tuned performance of Informatica sessions for large data files by increasing the data cache size and the target-based commit interval
  • Optimized the Cache for Dynamic, Static and Persistent Cache Lookup Transformations.
  • Involved in writing stored procedures
  • Validated Views as part of post-SQL
  • Worked on performance tuning of SQL and Mappings using SQL Overrides in Lookups and Source Filters in the Source Qualifier (an example override follows this list)
  • Involved in code readability and code reviews.
  • Developed Korn shell scripts for Informatica pre-session and post-session procedures
  • Involved in loading data to staging areas.
  • Used Maestro to schedule jobs.
  • Tested all the business application rules with test & live data and automated, monitored the Sessions using Workflow Monitor
  • Performed extensive testing and wrote SQL queries to verify that the data loaded correctly
  • Coordinated the QA and BA teams with the development team for each new release of the software.
  • Documented processing times for each module, developed test cases and used them to run through each process.
  • Tuned the matching parameters based on test results.
  • Provided production support.
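
An example of the kind of Lookup SQL override used for tuning; the table and port names are hypothetical. Pre-filtering to current rows keeps the lookup cache small, and the trailing comment marks suppress the ORDER BY that PowerCenter would otherwise append:

    SELECT cust_key    AS CUST_KEY,
           cust_src_id AS CUST_SRC_ID
      FROM customer_dim
     WHERE current_flag = 'Y'
     ORDER BY CUST_SRC_ID, CUST_KEY --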

Environment: Informatica Power Center 7.1, Oracle 9i, SQL Server 2005, DB2, Flat files, MS Excel, SQL, PL/SQL, TOAD, UNIX, Business Objects 5.1.

Confidential

Database Developer

Responsibilities:

  • Designed the new database for the application and installed the Oracle server and client.
  • Allocated system storage and planned future storage requirements.
  • Created primary database storage structures (tablespaces) with appropriate placement of data files for maximum efficiency and performance (illustrative DDL follows this list).
  • Created primary objects (tables, views and indexes) as required by the application design.
  • Modified the database structure as necessary.
  • Developed Oracle PL/SQL, DDL and Stored Procedures and worked on performance tuning of SQL and PL/SQL stored procedures
  • Controlled and monitored user access to the database; monitored and optimized database performance.
  • Planned for backup and recovery of database information.
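
Illustrative DDL for the storage structures and primary objects described above; file paths, sizes and object names are hypothetical:

    CREATE TABLESPACE app_data
        DATAFILE '/u02/oradata/appdb/app_data01.dbf' SIZE 500M
        AUTOEXTEND ON NEXT 50M MAXSIZE 2000M;

    CREATE TABLE customer (
        customer_id   NUMBER(10)    PRIMARY KEY,
        customer_name VARCHAR2(100) NOT NULL,
        created_dt    DATE          DEFAULT SYSDATE
    ) TABLESPACE app_data;

    CREATE INDEX idx_customer_name
        ON customer (customer_name) TABLESPACE app_data;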

Environment: Oracle 8i, PL/SQL, Developer 2000, Shell Scripting, UNIX, Windows NT

Confidential

Oracle Developer

Responsibilities:

  • Performed data query, extraction, compilation, and reporting tasks (a sample query follows this list).
  • Managed, updated and manipulated report orientation and structures using advanced Excel functions, including Pivot Tables and VLOOKUPs.
  • Generated weekly, monthly, and quarterly reports necessary in maintaining a good and balanced financial statement.
  • Handled data collection, analysis, interpretation and presentation to management and other team members, gathering data from users and business partners associated with Supply Chain activities via a wide range of available means and methods.
  • Researched new means of qualifying and obtaining data and methods of using analytical tools effectively for systems development and improvement.
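
A sample of the kind of reporting query behind those extracts; the table and column names are hypothetical:

    -- Monthly order volume and spend by supplier for the trailing quarter.
    SELECT TO_CHAR(order_dt, 'YYYY-MM') AS report_month,
           supplier_id,
           COUNT(*)                     AS order_cnt,
           SUM(order_amt)               AS total_amt
      FROM supply_chain_orders
     WHERE order_dt >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -3)
     GROUP BY TO_CHAR(order_dt, 'YYYY-MM'), supplier_id
     ORDER BY report_month, supplier_id;
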
Environment: Oracle 8i, Toad, MS Excel, PowerPoint
