
Sr. Informatica Lead Developer Resume


TX

SUMMARY:

  • 9+ years of IT industry expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Participated in Agile team activities including daily standups, sprint planning, PSI planning, and product demos.
  • Highly proficient in ETL using Informatica PowerCenter 7.x, 8.x, 9.1, 9.5, and 9.6, and in reporting using SAP Business Objects XI R2, 3.1, and 4.0, Xcelsius, and Cognos.
  • Experience in performance tuning of Informatica mappings, and SQL query tuning.
  • Experience in scheduling Informatica workflows using Control-M, Informatica Scheduler, and ESP.
  • Worked with databases including IBM UDB DB2, Oracle, MS SQL Server, Teradata, and Netezza, as well as JDE.
  • Experience working in UNIX, Linux, and other environments.
  • SAP Business Objects and Xcelsius: 3+ years of SAP Business Objects 4.0/XIR3/XIR2 experience in Central Management Console, Central Configuration Manager, Designer, Web Intelligence, Desktop Intelligence, Import Wizard, Infoview, Dashboard Manager, and Scorecards.
  • Xcelsius 2008/4.5 data visualization with Live Office, QAAWS, Import Wizard, and the Report Conversion Tool and Services; built dashboards against various data sources including Oracle, DB2, and SQL Server.
  • Expertise in data warehousing concepts such as the Ralph Kimball and Bill Inmon methodologies, Star Schema, Snowflake Schema, fact tables, dimension tables, and dimensional data modeling using Erwin.
  • Domain Experience: Diversified experience in multiple domains like Oil & Gas, Manufacturing, Insurance, Federal, Finance, and Pharmaceutical.

TECHNICAL SKILLS

Operating System: Windows 2000, XP/NT, HP-UX, UNIX, and DOS.

Languages: C, SQL, SQL*PLUS, PL/SQL, Shell Programming, COBOL.

RDBMS & Other Tools: Oracle 11g/10g/9i, SQL Server, Teradata, DB2, TOAD, IBM Data Studio, SQL Developer, etc.

ETL, BI Tools: Informatica PowerCenter 7.1/8.5/9.1/9.5/9.6, Power Exchange, Business Objects XI R2/3.1, BI 4.0, Xcelsius, OBIEE, and Cognos Reporting.

Packages/Validation Tools: DVO, DataFlux, Pentaho Data Integration, Pentaho Spoon, IDQ 8.6, Erwin 4.2, MS Office, RUMBA, and Crystal Reports 6.

PROFESSIONAL EXPERIENCE

Confidential, TX

Sr. Informatica Lead Developer

Responsibilities:

  • Worked with customer and technical resources to address business information and application needs.
  • Interacted with the application users to gain an understanding of the business logic.
  • Provided effort estimates to the customer and application users.
  • Involved in data validation and bug fixes, including identifying and resolving performance bottlenecks.
  • Conduct code walkthroughs and review peer code and documentation.
  • Extracted data from flat files, XML files, JDE, Oracle, and other heterogeneous sources and loaded the data into the respective tables and files.
  • Developed medium- to complex-level Informatica mappings and tuned the mappings and workflows for optimum performance.
  • Extensively used Mapping Variables, Mapping Parameters, and Workflow variables for full load and incremental load.
  • Extensively used Informatica Transformation like Router, Filter, Lookup, Joiner, Aggregator, Sorter, Rank, Stored Procedure Transformation, Mapplets etc. and all transformation properties.
  • Improved performance using Informatica Partitions and other Performance Tuning techniques. Developed re-usable components in Informatica and UNIX.
  • Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager.
  • Distribution of work to offshore team and follow-ups.
  • Participating in daily, weekly & monthly standups with offshore & onsite.
  • Wrote Unit Test cases and executed test scripts successfully.
  • Validated the ongoing data synchronization process using DVO to ensure the data in the source and target systems stayed synchronized.
  • Extensively worked on code migration across the DEV, QA, and PROD environments.
  • Used Informatica scheduler for scheduling jobs in production and provided onsite Level-3 ETL production support.
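The full-load vs. incremental-load pattern driven by mapping variables (described above) can be sketched in Python. This is an illustrative stand-in, not the actual Informatica implementation; the variable names and sample data are hypothetical.

```python
from datetime import datetime

# Hypothetical stand-in for an Informatica mapping variable such as
# $$LAST_EXTRACT_TS: the highest source timestamp seen by the previous run.
last_extract_ts = datetime(2024, 1, 1, 0, 0, 0)

source_rows = [
    {"id": 1, "updated_at": datetime(2023, 12, 31, 23, 59)},
    {"id": 2, "updated_at": datetime(2024, 1, 2, 8, 30)},
    {"id": 3, "updated_at": datetime(2024, 1, 3, 12, 0)},
]

def incremental_extract(rows, watermark):
    """Return only rows changed since the watermark, plus the new watermark."""
    delta = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in delta), default=watermark)
    return delta, new_watermark

delta, last_extract_ts = incremental_extract(source_rows, last_extract_ts)
# delta contains ids 2 and 3; the watermark advances to 2024-01-03 12:00
```

In PowerCenter the same effect is achieved by filtering the source qualifier on the mapping variable and letting SETMAXVARIABLE persist the new watermark between runs.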

Confidential, IL

Sr. Informatica Developer/ On-Site Coordinator

Responsibilities:

  • Interacted with the Product Owners to gain an understanding of the business logic.
  • Participate in Agile team activities including daily standups, sprint planning and product demos, etc. Worked with the Product Owner to understand requirements and translate them into the design, development and implementation of ETL process using Informatica.
  • Working with business and technical resources to address business information and application needs.
  • Developed SQL and DB2 procedures, packages and functions to process the data for CGR Project (Complete Goods Reporting).
  • Involved in data validation, data integrity and database performance checks, field-size validation, check constraints, and data manipulation and updates using SQL.
  • Extracted data from flat files, DB2, and SAP, and loaded the data into the respective tables.
  • Implementing various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy for CGR.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD, Type 2 SCD and fact implementation.
  • Extensively used Mapping Variables, Mapping Parameters, and Parameter Files for capturing delta loads.
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
  • Used CSV files and tables as sources and loaded the data into relational tables.
  • Created and Configured Workflows, Worklets, and Sessions to load the data to Netezza tables using Informatica PowerCenter.
  • Worked with the business analyst team to analyze the final data and fix any issues.
  • Used pushdown optimization to achieve good performance in loading data into Netezza.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Identified and eliminated duplicates in datasets using the IDQ 8.6.1 Edit Distance, Jaro Distance, and Mixed Field Matcher components, enabling a single view of customers and helping control mailing-list costs by preventing duplicate mailings.
  • Conduct code walkthroughs and review peer code and documentation.
  • Wrote UNIX scripts for handling the source data and mappings.
  • Validated the ongoing data synchronization process using validation tools to ensure the data in the source and target systems stayed synchronized.
  • Extensively used ESP tool for scheduling Informatica batch jobs and provided level-3 production support on rotation.
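The fuzzy de-duplication described above relies on string-distance matchers. A minimal Python sketch of the edit-distance approach follows; it is a simplified stand-in for IDQ's matcher components, and the names, threshold, and sample data are illustrative only.

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein edit distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            # deletion, insertion, substitution
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost))
        prev = curr
    return prev[-1]

def dedupe(names, max_distance=2):
    """Keep a name only if it is not within max_distance of one already kept."""
    kept = []
    for name in names:
        if all(edit_distance(name.lower(), k.lower()) > max_distance for k in kept):
            kept.append(name)
    return kept

# "Jon Smith" and "John Smyth" collapse onto the first "John Smith"
print(dedupe(["John Smith", "Jon Smith", "John Smyth", "Mary Jones"]))
```

Production matching in IDQ also weighs Jaro distance and mixed-field comparisons; a single threshold on raw edit distance, as above, is only the simplest form of the idea.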

Confidential, IL

Sr. Informatica Developer

Responsibilities:

  • Involved in requirement gathering with different DC for business requirements and functional specifications.
  • Extensively worked on analysis, design, ETL development, unit testing, QA/production deployments, and production support.
  • Worked with developers at onsite/offshore and clearly communicated deliverables to offshore team on Informatica development and documentation.
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.
  • Implement best practices to maintain optimal performance.
  • Conduct code walkthroughs and review peer code and documentation.
  • Worked with the business analyst team to analyze the final data and fixed the issues if any.
  • Writing medium to complex DB2 queries. Working with DB admin to resolve the performance issues.
  • Developed medium- to complex-level Informatica mappings and tuned the mappings and workflows for optimum performance.
  • Extensively worked on data validation tools and developed DataFlux/DVO jobs for comparing the source (DB2) and target (Salesforce) to ensure the data is in sync and populated according to business requirements.
  • Extensively worked on salesforce API web connections for connecting Informatica & SAS DataFlux.
  • Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
  • Implemented Informatica CDC for capturing the changes made to the legacy systems.
  • Extensively worked on Pentaho Data Integration and Pentaho Spoon to extract data from DB2 and Salesforce and load it into Oracle.
  • Experience in accessing the mainframe using RUMBA.
  • Developing Teradata Utility scripts like FastLoad, MultiLoad to load data from various source systems to Teradata.
  • Worked on Informatica IDQ 8.6 to complete initial data profiling and matching/removal of duplicate data.
  • Experience in working with different Insurance domains- Auto, Home & Property, Commercial and Life.
  • Developed reusable transformations and reusable mapplets.
  • Provided 24/7 production support and met SLAs.
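The change data capture (CDC) work mentioned above can be illustrated with a snapshot-diff sketch in Python. Note that Informatica CDC (PowerExchange) actually reads database logs; comparing keyed snapshots, as below, is only a simplified stand-in, and the sample keys and values are hypothetical.

```python
def diff_snapshots(previous, current):
    """Compare two keyed snapshots of a table and classify the changes.

    A simplified illustration of CDC output: which business keys were
    inserted, updated, or deleted between two extracts.
    """
    inserts = [k for k in current if k not in previous]
    deletes = [k for k in previous if k not in current]
    updates = [k for k in current if k in previous and current[k] != previous[k]]
    return {"insert": sorted(inserts), "update": sorted(updates), "delete": sorted(deletes)}

prev = {"C001": ("Acme", "IL"), "C002": ("Globex", "TX")}
curr = {"C001": ("Acme", "WI"), "C003": ("Initech", "CA")}
print(diff_snapshots(prev, curr))
# {'insert': ['C003'], 'update': ['C001'], 'delete': ['C002']}
```

The classified change set then drives the downstream load: inserts become new rows, updates drive Type 1/Type 2 dimension handling, and deletes are flagged or soft-deleted in the target.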

Confidential, VA

Sr. SAP Business Objects Developer

Responsibilities:

  • This project was a proof of concept (POC) for the automation of the 2013 budget book for the Office of Planning and Budget (OPB).
  • Involved in tasks ranging from requirements gathering, defining functional specifications, analysis, and work estimation for the prototype development to presenting it to the stakeholders and documentation.
  • Designed and developed the Universe and WebI reports, and embedded the WebI report parts into Word documents using Live Office.
  • Worked on the performance improvement of the different reports.
  • Created user groups and provided them access accordingly.
  • Helped power users/SME to install the Client version of Business Objects including the SAP BO Live Office 4.0.
  • Provided multiple live/WebEx training sessions for different users and gave project demos for the stakeholders.
  • Designed universes and reports per the PTO guidelines.
  • Worked with ETL team and other team members while loading the data into “Data warehouse”.
  • Responsible for designing and developing Automated reports using live office.
  • Delivered the deliverables for each milestone on time, ensuring customer satisfaction.
  • Enhanced and improved the existing reports as per the customer requirements.
  • Designed and developed dashboards and scorecards using SAP BO Dashboard Design 4.0 for higher management.
  • Developed new drill-down/up reports and input control enabled reports from the existing universes for the data analysis purposes.
  • Performed unit testing, and wrote test scripts for the regression testing and performance testing for the testing team.

Confidential, MD

Sr. Informatica/SAP Business Objects Developer

Responsibilities:

  • Extracted data from various heterogeneous sources: DB2 UDB, Oracle, flat files, and XML files.
  • Wrote medium to complex queries aligned with the data model and tuned them for optimum performance.
  • Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Extensively used Informatica Transformations like Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, Sorter, etc., and all transformation properties.
  • Solid Expertise in using both connected and unconnected Lookup Transformations.
  • Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
  • Extensively worked on XML files as Source and Target.
  • Used transformations like XML Generator and XML Parser to transform XML files.
  • Worked on the Informatica debugger wizard to troubleshoot data and error conditions.
  • Responsible for best practices like naming conventions, and performance tuning.
  • Developed Reusable Transformations and Reusable Mapplets.
  • Extensively used Various Data Cleansing and Data Conversion Functions in various Transformations.
  • Involved in migrating SAP Business Objects reports from XI R2 to BO XI 3.0.
  • Involved in developing, designing and maintaining new and existing universes by creating joins and cardinalities, responsible for creation of the universe with Classes, Objects and Condition Objects.
  • Involved in creation of SAP Business Objects XI R3.0 Desktop Intelligence and Web Intelligence Reports with different functionalities like Dynamic cascading prompts, drill down, linking reports, charts and graphs, ranking and alerts based on user feedback.
  • Used multi-dimensional analysis (Slice & Dice and Drill Functions) to organize the data along a combination of Dimensions and Drill-down Hierarchies giving the ability to the end-users to view the data.
  • Created graphical representation of reports such as Bar charts, 3D charts and Pie charts etc.
  • Created Web Intelligence reports with multiple data providers and synchronized the data using merge dimension option inSAP Business Objects XI R3.0.
  • Responsible for migrating the code using deployment groups across various instances.
  • Optimized SQL queries for better performance.
  • Responsible for Unit Testing of Mappings and Workflows.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD, Type 2 SCD and fact implementation.
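The Type 1 vs. Type 2 slowly changing dimension mappings mentioned above follow a standard pattern: Type 1 overwrites the attribute in place, while Type 2 expires the current row and inserts a new version. A minimal Python sketch, with hypothetical column names and dates, is:

```python
from datetime import date

def apply_scd(dim_rows, key, new_attrs, scd_type, today=date(2024, 1, 15)):
    """Apply an incoming record to a dimension table held as a list of dicts.

    Type 1: overwrite the current row (no history kept).
    Type 2: expire the current row and insert a new current version.
    """
    current = next(r for r in dim_rows if r["key"] == key and r["is_current"])
    if scd_type == 1:
        current.update(new_attrs)          # overwrite in place
    else:
        current["is_current"] = False      # close out the old version
        current["end_date"] = today
        dim_rows.append({"key": key, **new_attrs,
                         "start_date": today, "end_date": None,
                         "is_current": True})
    return dim_rows

dim = [{"key": "CUST1", "city": "Dallas",
        "start_date": date(2020, 1, 1), "end_date": None, "is_current": True}]
apply_scd(dim, "CUST1", {"city": "Austin"}, scd_type=2)
# dim now holds two rows: the expired Dallas version and a current Austin one
```

In PowerCenter the same logic is typically built with a Lookup against the dimension, an Expression comparing attributes, and an Update Strategy routing rows to insert or update.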

Confidential, FL

Informatica Developer

Responsibilities:

  • Analyzed the logical model of the databases and normalized it where necessary.
  • Involved in identification of the fact and dimension tables.
  • Extensively used Informatica-Power Center 7.1 for extracting, transforming and loading into different databases.
  • Wrote PL/SQL stored procedures and triggers for implementing business rules and transformations.
  • Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.
  • Created Source and Target Definitions in the repository using Informatica Designer - Source Analyzer and Warehouse Designer.
  • Worked extensively on different types of transformations like Source qualifier, Expression, Filter, Aggregator, Rank, Lookup, Stored procedure, Sequence generator.
  • Replicated operational tables into staging tables, to transform and load data into the enterprise data warehouse using Informatica.
  • Created and scheduled Worklets, configured email notifications. Set up Workflow to schedule the loads at required frequency using Power Center Workflow Manager, Generated completion messages and status reports using Workflow Manager.
  • Involved in Performance Tuning at various levels including Target, Source, Mapping, and Session for large data files.
  • Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
  • Documented Data Mappings/ Transformations as per the business requirement.
  • Performed testing and knowledge transfer, and mentored other team members.

Confidential

ETL Programmer

Responsibilities:

  • Analyzed the systems, met with end users and business units in order to define the requirements.
  • Responsible for developing technical specifications from business requirements.
  • Extracted a high volume of data from DB2 tables and flat files and populated it into Oracle using SAS Data Integration ETL.
  • Created high-level, low-level, and detailed technical design specifications for the SAS ETL.
  • Understood the business requirements of external source systems and performed cleansing, standardization, and generation of match cases and clusters.
  • Wrote PL/SQL procedures, functions, triggers and packages for processing business logic in the database.
  • Experience in query optimization, performance tuning, and DBA solutions, with implementation experience across the complete system development life cycle.
  • Designed coding specification and documentation.
  • Involved in ETL coding using PL/SQL in order to meet requirements for Extract, transformation, cleansing and loading of data from source to target data structures.
  • Involved in the continuous enhancements and fixing of production problems.
  • Designed, implemented and tuned interfaces and batch jobs using PL/SQL. Involved in data replication and high availability design scenarios with Oracle Streams.
