
Informatica Resume


Portland, OR

SUMMARY

  • Over 8 years of total IT experience and technical proficiency in Data Warehousing, combined with Data Analysis, Data Modeling, Business Requirements Analysis, Application Design, Development & Testing, Data Profiling, Data Standardization & Quality Control, and full life cycle implementation of Data Warehouses.
  • Proficiency in interviewing stakeholders, requirements analysis and realization using Use Cases, deriving testable functional / non-functional system requirements, providing strategic and tactical inputs and coordinating with business functions and technical project teams from conception through deployment.
  • Involved in Requirement gathering, Data Analysis, Application Design, Data Modeling, Application Development, Implementations and Data Quality in different Data warehouse Projects.
  • Experience in creating efficient and effective high- and low-level ETL documentation for both the business and development communities.
  • Evaluated numerous complex system problems and user requirements. Ability to develop structured and modular solutions.
  • Experience in designing complex ETL/ELT solutions for Data Warehouse Programmes.
  • Experience in both the integration and reporting areas of Data Warehousing using Informatica, Cognos, Business Objects, Teradata and Oracle.
  • Skilled in preparing Word documentation and Visio diagrams to better depict business processes.
  • Strong Data Warehousing experience in application development & quality assurance testing using Informatica Power Center 8.6/7.1/6.2/5.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Informatica Power Mart, Power Connect 6.2, PowerExchange, OLAP, OLTP.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
  • Skilled in writing SQL and developing views and stored procedures in Teradata.
  • Extensive experience in developing and performance tuning Informatica mappings.
  • Skilled in using the Pushdown Optimization feature of Informatica.
  • Experience in configuring and developing maps using PowerExchange for SAP & DB2.
  • Extensive knowledge of Teradata SQL Assistant, TPump, FastExport, FastLoad, MultiLoad and BTEQ; coded complex scripts for the Teradata utilities and fine-tuned queries to enhance performance.
  • Extensive knowledge of the Teradata Analyst Pack, including Visual Explain, System Emulation Tool, Teradata Manager and PMON.
  • Skilled in developing ETL maps for Data Warehouse loading.
  • Experience in working in both development as well as production support stages of the project.
  • Worked extensively on performance tuning of ETL maps and considerably decreased ETL load timings.
  • Involved in designing and building Business Objects Universes from different data sources such as Oracle and SQL Server.
  • Created the report mockups and prepared the report design documents.
  • Hands on experience of using MS Office Suite, MS Project, MS Visio, Rational tools, Erwin.
  • Experience in coordinating with the configuration and change management groups for Clear Case Version Control and Build Making Process.
  • Excellent analytical, problem solving skills and a motivated team player with good communication and inter-personal skills.
  • Extensively worked with various systems such as Oracle, Teradata, MS Access, flat files, XML, Mainframe, SAP BW and SQL Server.
  • Sound understanding of the value and meaning of information, data quality and metadata.

SKILLS

Data warehousing, business intelligence, decision support/knowledge management, information management, ETL Architectures, Relational Data Modeling and Database Design

Databases

GUIs/Reporting Tools/CASE Tools

Erwin, Power Designer, Business Objects 5.0/6.0, Control-M, AutoSys

Hardware

NCR WorldMark servers, Sun E10000 server

Languages

C, C++, SQL, HTML

Methodologies

Star schema database design, structured systems analysis and design

Operating Systems

Windows NT, UNIX, NCR-UX, AIX, MS-DOS

Warehousing Tools

Informatica Power Center 8.6/8.1/7.1/6.1/5.1.2, Power Mart 5.1/4.7, Power Connect 1.5, PowerExchange, Oracle 9i/10g, TOAD, PuTTY, Edit Plus, Merant Version Manager, Cognos, Business Objects, VSAM flat files, XML with Normalizer.
Teradata utilities: BTEQ, FastExport, FastLoad, MultiLoad, TPump, Teradata Query Manager, Priority Scheduler, Visual Explain, SQL Assistant

PROFESSIONAL EXPERIENCE

Confidential, Portland OR, Jun’10 - Till date

Sr. Informatica Consultant

Responsibilities:

  • Interacted with the Business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
  • Developed FRD (Functional requirement Document) and data architecture document and communicated with the concerned stakeholders. Conducted Impact and feasibility analysis.
  • Worked on dimensional modeling to design and develop STAR schemas by identifying the facts and dimensions. Designed logical models as per business requirements using Erwin.
  • Designed and Developed ETL mappings using transformation logic for extracting the data from various sources systems.
  • Extensively used Informatica Power Center and created mappings using transformations such as Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
  • Created, Tested and debugged the Stored Procedures, Functions, Packages, Cursors and triggers using PL/SQL developer.
  • Queried the Oracle 10g database to retrieve millions of records using PL/SQL.
  • Extensively used PL/SQL programming in backend and front-end functions, procedures, packages to implement business rules, security efficiently and test cases for the project.
  • Design and Development of data validation, test cases, and error control routines using PL/SQL, SQL. Involved extensively in designing & coding.
  • Involved in PL/SQL statements performance tuning.
  • Used the feature EXPLAIN PLAN to find out the bottlenecks in a given Query, thus improving the performance of the job.
  • Reviewed the Testing progress and issues to be resolved by conducting walkthroughs.
  • Involved in writing unit test cases, unit testing and user acceptance testing to verify that data extracted from different source systems was loaded into the target according to user requirements.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
  • Gathered requirements for scheduling Cognos Impromptu reports; involved in development, testing, scheduling and maintenance of the reports.
  • Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for some of the base tables, as a consistency check.
  • Made use of Post-Session success and Post-Session failure commands in the Session task to execute scripts needed for clean up and update purposes.
  • Responsible for regression testing ETL jobs before test to production migration.
  • Extensively used PowerExchange to source mainframe COBOL copybooks & VSAM files.
  • Extensively used Normalizer to process mainframe VSAM & COBOL files.
  • Involved in migration of ETL code from development repository to testing repository and then to production repository.
  • Handled the dead lock errors that occurred due to dependency of jobs running simultaneously.
  • Provided support for daily and weekly batch loads.
  • Prepared Run book for the daily batch loads giving the job dependencies and how to restart a job when it fails for ease of handling job failures during loads.
  • Conducted meetings for every deployment to make sure the job schedules and dependencies are developed in such a way that we are not missing the SLA on a day to day basis.
  • Involved in facilitating the allocation by prioritizing the data issues raised by end user based on the criticality.
  • Resolved and closed production tickets generated due to failure of daily incremental production jobs.
  • Functioned as the primary liaison between the business line, operations, and the technical areas throughout the project cycle.
  • Provided data loading, monitoring, system support and general trouble shooting necessary for all the workflows involved in the application during its production support phase.
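
The daily count-verification scripts mentioned above could look like this minimal sketch. The table name, state-file path and the stubbed count are illustrative (not from the actual project); in the real job the count would come from the database via sqlplus or BTEQ.

```shell
#!/bin/sh
# Sketch of a daily record-count consistency check for an incremental load.
# TABLE and STATE_FILE are hypothetical; TODAY is stubbed for illustration.

TABLE="SALES_FACT"                      # hypothetical base table
STATE_FILE="/tmp/${TABLE}.prev_count"   # yesterday's count, kept between runs

# Yesterday's count (0 on the first run).
PREV=0
[ -f "$STATE_FILE" ] && PREV=$(cat "$STATE_FILE")

# Stub for:  SELECT COUNT(*) FROM $TABLE;  issued through sqlplus -s or BTEQ.
TODAY=120

ADDED=$((TODAY - PREV))
echo "$TABLE: previous=$PREV today=$TODAY added=$ADDED"

# An incremental load should only add rows here; flag a shrinking table.
if [ "$ADDED" -lt 0 ]; then
    echo "ERROR: row count decreased for $TABLE" >&2
fi

# Persist today's count for tomorrow's comparison.
echo "$TODAY" > "$STATE_FILE"
```

A scheduler such as Control-M would run this after the load and treat a non-zero exit or the ERROR line as a job failure.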

Environment: Informatica Power Center 8.6.1, Cognos Impromptu 7.4, Oracle 11i/10g, DB2, Teradata, SQL Server, XML files, TOAD, Teradata SQL Assistant 7.2, SQL, PL/SQL, Windows XP, UNIX, Control-M.

Confidential, Burlington MA, Dec’07 - May’10

Informatica Consultant

Responsibilities:

  • Worked on dimensional modeling to design and develop STAR schemas by identifying the facts and dimensions. Designed logical models (relationship, cardinality, attributes, and candidate keys) as per business requirements using Erwin.
  • Involved in designing the process flow for extracting the data across various source systems.
  • Worked with Reporting team to get to an agreement on what level of complexity to be handled by ETL and reporting side.
  • Extensively used Informatica Power Center and created mappings using transformations such as Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Developed and scheduled Workflows using task developer, worklet designer and workflow designer in Workflow manager and monitored the results in Workflow monitor.
  • Developed sessions using different types of partitioning, such as round robin and hash key partitioning, for better performance.
  • Involved in Developing UNIX shell Scripts to generate parameter files and executed oracle procedures as batch jobs.
  • Designed and implemented stored procedures, views and triggers for automating tasks in PL/SQL.
  • Created permanent tables, temporary tables, table variables and indexes.
  • Created triggers to audit oracle data changes and to control data manipulation.
  • Extensively worked on mainframe VSAM files using the Normalizer transformation.
  • Created Complex Stored Procedures, User Defined Functions, Cursors, Views, and Alter, Insert Update statements for the User Interface, Reports and Packages.
  • Used Query Profiler to optimize SQL queries and Developed and executed several optimized queries in SQL.
  • Worked on improving the performance of the system.
  • Developed PL/SQL procedures, functions to facilitate specific requirement.
  • Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.
  • Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for some of the base tables, as a consistency check.
  • Performed Unit Testing and verified the data and performed Error Handling on sessions in Workflow, which was extracted from different source systems according to the user requirements.
  • Created scripts for Batch test and set the required options for overnight, automated execution of test scripts
  • Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation, variables.
  • Responsible for regression testing ETL jobs before test to production migration.
  • Provided support in handling file validation errors and database down and connectivity errors.
  • Provided support for daily and weekly batch loads.
  • Investigated and fixed problems encountered in the production environment on a day to day basis.
  • Responsible for daily internal and external SLA, communicating in advance of any SLA miss to both technical and functional audience.
  • Prepared Run book for the daily batch loads giving the job dependencies and how to restart a job when it fails for ease of handling job failures during loads.
  • Troubleshot data and performance issues raised by the user community; designed, developed and deployed code to fix them.
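
The parameter-file generation described above could be sketched as follows. The folder, workflow, session and parameter names are hypothetical, not taken from the original project; PowerCenter parameter files use a `[Folder.WF:workflow.ST:session]` heading followed by `name=value` pairs.

```shell
#!/bin/sh
# Sketch of generating an Informatica PowerCenter parameter file for a
# daily incremental load. All object names below are illustrative.

RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="wf_daily_load_${RUN_DATE}.par"

# Write the section heading and mapping parameters; \$\$ emits the literal
# $$ prefix that PowerCenter uses for mapping parameters/variables.
cat > "$PARAM_FILE" <<EOF
[FOLDER_SALES.WF:wf_daily_load.ST:s_m_load_sales_fact]
\$\$RUN_DATE=${RUN_DATE}
\$\$SRC_SCHEMA=STG
\$\$TGT_SCHEMA=DWH
EOF

echo "generated $PARAM_FILE"
```

A pre-session command or the scheduler would run this before the workflow, and the session would reference the file via its parameter-file setting.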

Environment: Informatica Power Center 8.6, Business Objects XI, Oracle 11i/10g, DB2, MS SQL Server, XML files, TOAD, SQL, PL/SQL, Windows XP, UNIX, AutoSys.

Confidential, Pune India, May’05 - Oct’07

Informatica/Teradata Consultant
Worked as an Informatica/Teradata Consultant for the Integrated Planning Information Management team.
Responsibilities:

  • Prepared ETL Technical Specifications documents, Worked along with Business System Analysts to build optimal ETL Design.
  • Worked along with data modelers to design staging layer for the Planning Application.
  • Worked along with Project Managers in estimating efforts required for ETL design and development.
  • Worked on configuration of PowerExchange with SAP BW.
  • Developed Informatica mappings using Informatica 8.6 with Teradata as the target database.
  • Extensively Used Informatica Pushdown Optimization and Grid features to shorten the data load time.
  • Worked on Designing and developing ETL solutions using Teradata Utilities like TPUMP, FAST LOAD and MLOAD.
  • Worked on developing Informatica mappings using different source systems such as Oracle, SQL Server, SAP BW and Teradata.
  • Worked along with Architects and designed detail level of data Flow diagrams between the systems.
  • Extensively worked on performance tuning of the push down queries and Teradata Queries.
  • Designed parallel processing for ETL mappings to decrease the load timings.
  • Worked extensively on Informatica versioning, labels and the deployment process.
  • Worked as release manager for the team, moving objects across environments.
  • Extensively worked on writing views for business users and consumers of the data from other teams.
  • Designed an Informatica solution that reduced the number of Informatica objects to maintain and the associated overhead costs.
  • Designed delta or Incremental Load process for all the ETL jobs.
  • Worked with data modelers in identifying columns required for statistics collection in the database.
  • Worked on Data Analyzer to Automate Code Review Process.
  • Designed and developed Batch Tables to capture batch details of the jobs.
  • Worked with BSA’s and ISA’s to develop Test Plan strategy for QA stage of the development process.
  • Worked with project managers in preparing build plans for implementation of Informatica and Teradata objects in QA and Production.
  • Analyzed the source systems to determine elements needed for ETL designs and did extensive data profiling for ETL maps.
  • Designed ETL Load strategies for Initial and Incremental Data load.
  • Worked on Informatica Utilities - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer & Transformation Developer
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Created BTEQ scripts for data extraction and manipulation.
  • Created FastLoad and MultiLoad scripts for data loading.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain. Created several custom tables, views and macros to support reporting and analytic requirements.
  • Involved in creating secondary, join indexes for efficient access of data Based on the requirements
  • Helped DBA team while creating Targets in selecting the Primary Index columns by analyzing the data.
  • Extensively worked on Informatica version control, Relay, and Quality Center tools for defect tracking.
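
The BTEQ extraction scripts mentioned above are typically generated and then submitted by a batch wrapper. A hedged sketch of such a wrapper follows; the database, table, column and credential values are illustrative, and the final `bteq` invocation is shown in a comment rather than executed.

```shell
#!/bin/sh
# Sketch of generating a BTEQ script for a delta (incremental) extract.
# Database, table, column and logon values below are hypothetical.

LOAD_DATE=$(date +%Y-%m-%d)
BTEQ_SCRIPT="delta_extract.btq"

cat > "$BTEQ_SCRIPT" <<EOF
.LOGON tdprod/etl_user,etl_password;
.EXPORT FILE = sales_delta.dat;

SELECT *
FROM   DWH.SALES_FACT
WHERE  LOAD_DT = DATE '${LOAD_DATE}';

.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF

echo "generated $BTEQ_SCRIPT"
# In the real batch this would then run:  bteq < "$BTEQ_SCRIPT"
```

Filtering on a load-date column like this is one common way to implement the delta/incremental load strategy described above, with FastLoad or MultiLoad handling the bulk reload side.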

Environment: Teradata V2R12, Oracle 10g, Informatica 8.5/8.6, PowerExchange for SAP, DB2, Cognos 8, TOAD, Erwin, Edit Plus, Relay, AutoSys, UNIX, SQL Server 2005

Confidential, Hyderabad India, July’03 - Apr’05

ETL Consultant

As an ETL Consultant for Process Management, I designed various ETL strategies and maps for subject areas such as Accounts Payable, Orders and Sales, Portfolio Risk Management, and Inventory Control Management.
Responsibilities:

  • Analyzed business process and gathered core business requirements, Interacted with business analysts and end users.
  • Understood the various source systems, as-is processes and data elements required for building metrics such as accounts payable excellence and self-service purchasing.
  • Worked with data architects to maintain data quality by defining an ETL strategy for maintaining referential integrity (RI) constraints in the database.
  • Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL procedures that are consistent across all application and systems.
  • Prepared a handbook of standards for Informatica code development and handled enterprise-level data cleansing with support from subject matter experts and stakeholders.
  • Responsible for process strategies in the conversion lifecycle: laying down the ETL architecture, process flow, data integration, data validations and error handling.
  • Partitioned Sessions for concurrent loading of data into the target tables.
  • Involved in identifying bottlenecks and tuning to improve performance.
  • Worked on Business Objects, designing Universes and reports for business users.
  • Created different types of reports like Cross Tab, Charts, and Master Detail using multiple data sources.
  • Created reports with drill through and drill by functionality.

Environment: Windows, Informatica 7, Edit Plus, IBM DB2 8.1 (Mainframe), VSAM flat file sources, SQL/PL-SQL, Oracle, Rational Unified Process (RUP), UML, Erwin, Microsoft SQL Server 2000

EDUCATION

Bachelor of Engineering in Electronics & Communication
