
Senior Informatica ETL BI Developer Resume


Columbus, OH

SUMMARY

  • 8½ years of experience designing, developing, maintaining, and building large business applications such as data migration, integration, conversion, data warehousing, and testing.
  • Expert in all phases of the Software Development Life Cycle (SDLC): project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, post-implementation support, and maintenance.
  • Reviewed and assessed business requirements, identified gaps, defined business processes, and delivered project roadmaps including documentation, initial source data definitions, mappings, detailed ETL development specifications, and operations documentation.
  • Expertise in data warehousing, ETL architecture, Data Profiling.
  • Experience working with various versions of Informatica PowerCenter (9.0/8.6/8.5/8.1/7.1/6.2) client and server tools.
  • Expertise in tuning the performance of mappings and sessions in Informatica and determining the performance bottlenecks.
  • Experience creating pre-session and post-session scripts to ensure timely, accurate processing and balanced job runs.
  • Experience integrating various data sources such as SQL Server, Oracle, Teradata, flat files, and DB2 on mainframes.
  • Strong experience in complex PL/SQL packages, functions, cursors, triggers, views, materialized views, T-SQL, and DTS.
  • Expert in troubleshooting, debugging, and improving performance at different stages: database, workflows, and mappings.
  • Thorough knowledge of different OLAP architectures: DOLAP, MOLAP, ROLAP, and HOLAP.
  • In-depth knowledge of designing fact and dimension tables and physical and logical data models using Erwin 4.0, including forward and reverse engineering.
  • Involved in writing Unit test cases for complex scenarios.
  • Experience in creating UNIX shell scripts and Perl scripts.
  • Knowledge of designing and developing Business Intelligence reports using BI tools such as Business Objects and Cognos, with working knowledge of MicroStrategy.
  • Knowledge of installing and configuring Informatica server with SQL Server and Oracle; able to handle Informatica administrator tasks such as configuring DSNs, creating connection strings, copying and moving mappings and workflows, and creating folders.
  • Experience includes thorough domain knowledge of business financial systems, banking, healthcare information technology, insurance and reinsurance, pharmacy claims systems, and the telecom industry.
  • Enthusiastic and goal-oriented team player possessing excellent communication, interpersonal skills and leadership capabilities with high level of adaptability.
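The pre/post-session balancing checks mentioned above can be sketched roughly as follows. This is a hypothetical illustration, not a script from any of the projects: the function name, file names, and the pipe-delimited control-file layout (expected row count in the first field) are all made-up assumptions.

```shell
#!/bin/sh
# Hypothetical post-session balancing check: compare the row count of a
# session's data file against the expected count recorded in a control
# file. File layouts here are illustrative only.
balance_check() {
    data_file=$1   # delimited data file produced by the load
    ctrl_file=$2   # control file; first field holds the expected row count
    expected=$(cut -d'|' -f1 "$ctrl_file")
    actual=$(wc -l < "$data_file" | tr -d ' ')
    if [ "$expected" -eq "$actual" ]; then
        echo "BALANCED: $actual rows"
    else
        echo "OUT OF BALANCE: expected $expected, got $actual" >&2
        return 1
    fi
}
```

A post-session command would call `balance_check <data> <ctrl>` and let a nonzero exit status fail the session, keeping an out-of-balance load from going unnoticed.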

TECHNICAL SKILLS

Data warehousing: Informatica PowerCenter 9.0/8.6/8.5/8.1/7.1/6.2/5.1, PowerExchange for DB2, Metadata Reporter, Data Profiling, Data Cleansing, OLAP, OLTP, Star & Snowflake Schema, Fact & Dimension Tables, Physical & Logical Data Modeling, DataStage 7.x, Erwin 4.0

BI Tools: Business Objects XI R2/6.0/5.1/5.0, MicroStrategy 8, Cognos 8

Databases: SQL Server 2005/2000/7.0/6.5, Oracle 11i/10g/9i/8i/8/7.3, Sybase, Teradata 6, MySQL, MS Access, DB2 8.0/7.0

Languages: C, C++, XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting

Operating Systems: HP-UX 11/10/9, IBM AIX 4.0/3.1, Sun Solaris 9/8/7/2.6/2.5, SCO UNIX, Linux, Windows XP Professional/2000/NT/98/95

Other Tools: MS Visual SourceSafe, PVCS, AutoSys, Control-M, Remedy, Mercury Quality Center, StarTeam

DB Tools: SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace

PROFESSIONAL EXPERIENCE

Confidential, Columbus OH

Senior Informatica ETL BI Developer

Responsibilities:

  • Performed estimation, requirements analysis, and design of the mapping document, and planned for Informatica ETL and DataStage; acted as PR4/PR3 programmer.
  • Acted as Team lead in coordinating offshore ETL Development
  • Worked on upgrading and data analysis of ERIC (Employment Resource Information Center) data modeling.
  • Set up a metadata-driven utility design for data extraction, derivation, transformation, loading, and validation processes using Informatica 9.0.
  • Designed and created logical and physical data models to support strategic business decisions.
  • Used transformations such as Aggregator, Lookup, Filter, and Sequence Generator, among others, per business requirements.
  • Developed complex PL/SQL procedures and packages as part of transformation and data cleansing.
  • Developed UNIX shell scripts to control the process flow of Informatica workflows handling high-volume data; documented using Microsoft Office.
  • Set up batches and sessions to schedule loads at the required frequency using PowerCenter Workflow Manager, accessing mainframe DB2 and AS/400 systems.
  • Extensively used Informatica debugger to validate Mappings and to gain troubleshooting Information about data and error conditions.
  • Performance tuning to ensure optimal session performance and worked in troubleshooting ETL ODI applications.
  • Designed and developed Informatica mappings enabling the extract, transport, and load of data into target tables and into Teradata.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using Workflow Manager, and passed the data to Microsoft SharePoint.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations.
  • Involved in generating reports from Data Mart using OBIEE and working with Teradata.
  • Defects were tracked, reviewed, and analyzed.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Performed Configuration Management to Migrate Informatica mappings/sessions /workflows from Development to Test to production environment.
  • Designed and developed processes to handle high volumes of data and high-volume data loading within given load intervals.
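The UNIX shell control of Informatica workflows described above typically centers on Informatica's pmcmd command-line client. The sketch below is a hedged illustration: pmcmd and its startworkflow mode are real, but the service, domain, folder, and workflow names are placeholders, and the PMCMD variable is only a hook so the wrapper can be exercised without an Informatica installation.

```shell
#!/bin/sh
# Illustrative wrapper: start an Informatica workflow via pmcmd and retry
# once on failure. INT_SVC, DOMAIN_DEV, DM_FOLDER, and the workflow name
# are placeholders, not values from the actual environment.
PMCMD=${PMCMD:-pmcmd}   # overridable so the wrapper can be dry-run

run_workflow() {
    wf=$1
    attempts=0
    while [ "$attempts" -lt 2 ]; do
        if $PMCMD startworkflow -sv INT_SVC -d DOMAIN_DEV \
                -f DM_FOLDER -wait "$wf"; then
            echo "workflow $wf completed"
            return 0
        fi
        attempts=$((attempts + 1))
        echo "workflow $wf failed, attempt $attempts" >&2
    done
    return 1
}
```

A scheduler entry would simply invoke `run_workflow wf_<name>` and alert on a nonzero exit, which keeps job-control logic out of the mappings themselves.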

Environment: Informatica PowerCenter 9.0, SQL Server 2005/2000, Oracle 11i, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX shell scripts, Cognos, Erwin, StarTeam, Remedy, Maestro job scheduler, Mercury Quality Center

Confidential, Columbus Ohio

Senior Informatica ETL/BI Developer/Data Analyst

Responsibilities:

  • Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the Data Mart.
  • Acted as Team lead in coordinating offshore ETL Development for EDI applications
  • Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
  • Assisted in designing Logical/Physical Data Models, forward/reverse engineering using Erwin 4.0.
  • Developed mappings to extract data from SQL Server, Oracle, Teradata, and flat files and load it into the data mart using PowerCenter; acted as PR4/PR3 programmer.
  • Developed common routine mappings; made use of mapping variables, mapping parameters, and variable functions. Worked with SQL Server Integration Services (SSIS) and Reporting Services (SSRS).
  • Used Informatica Designer in Informatica 9.0 to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.
  • Developed Type 1 Slowly Changing Dimensions (SCD) and worked on ETL ODI applications.
  • Used mapplets in mappings, saving valuable design time and effort.
  • Used Informatica Workflow Manager to create, schedule, execute, and monitor sessions, worklets, and workflows; accessed information on DB2 and AS/400.
  • Wrote procedures and queries to retrieve data from the data warehouse and implemented them in the data mart.
  • Extracted and transferred data to and from SQL Server databases using utilities such as BCP and BULK INSERT, and worked on contingency plans using SQL queries.
  • Published and scheduled Business Objects Reports to users using Scheduler and updated data to Microsoft SharePoint.
  • Used stored procedures as data providers to retrieve data from scheduled tables and complex queries.
  • Designed Business Objects universes based on the XI repository, developed Business Objects and Crystal Reports, and documented using Microsoft Office.
  • Involved in Design phase and Developed Logical Model and Physical Model of EPM.
  • Implemented Crystal Reports using the Business Objects universes.
  • Developed critical Web Intelligence and Desktop Intelligence reports such as drill-down, slice-and-dice, and master/detail for analysis of parts, benefits, and headcount.
  • Wrote shell scripts to run the workflows and automated them through the Maestro job scheduler.
  • Responsible for submitting DBA requests and following up with database DBAs and Informatica administrators, creating Remedy tickets, handling all sign-offs for production, versioning all changes using StarTeam, and creating test cases in Quality Center.
  • Wrote SQL queries, triggers, and PL/SQL procedures to apply and maintain business rules.
  • Troubleshot databases, workflows, mappings, sources, and targets to find bottlenecks and improve performance.
  • Created indexes and primary keys and performed other performance tuning at the database level.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Created testing metrics using MS-Excel
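The BULK INSERT contingency path mentioned above (used when a BCP transfer is unavailable) might look like the following generator sketch. It is an assumption-laden illustration: the table name, file path, and delimiter choices are hypothetical, not from the actual project.

```shell
#!/bin/sh
# Hedged sketch: write out a BULK INSERT contingency script for a
# SQL Server load. Table, path, and terminators are placeholders.
gen_bulk_insert() {
    table=$1; datafile=$2; outfile=$3
    cat > "$outfile" <<EOF
BULK INSERT $table
FROM '$datafile'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', TABLOCK);
EOF
}
```

The generated .sql file would then be submitted through a SQL Server client when the usual BCP transfer fails, as a fallback load path.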

Environment: Informatica PowerCenter 9.0, SQL Server 2005/2000, Oracle 11i/10g, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX shell scripts, Cognos, Erwin, StarTeam, Remedy, Maestro job scheduler, Mercury Quality Center

Confidential, Lansing MI

Sr. Informatica/Cognos Developer

Responsibilities:

  • Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build Data Warehouses.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
  • Identified all the dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables, accessed AS400 mainframe systems.
  • Tested the reports like Drill Down, Drill up and pivot reports generated from Cognos.
  • Assisted in designing Logical/Physical Data Models, forward/reverse engineering using Erwin 4.0.
  • Developed mappings to extract data from SQL Server, Oracle, Teradata, flat files, DB2, and mainframes and load it into the data warehouse using PowerCenter and PowerExchange.
  • Developed common routine mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.
  • Made use of mapping variables, mapping parameters and variable functions.
  • Developed Type 1 and Type 2 Slowly Changing Dimensions (SCD).
  • Exported/imported data between different databases and flat files using DTS packages and BCP, defining sources and targets, and using Microsoft SharePoint.
  • Performed performance tuning and debugging at different levels (workflows, mappings, database); documented using Microsoft Office.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Created testing metrics using MS-Excel
  • Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Performed Configuration Management to Migrate Informatica mappings/sessions /workflows from Development to Test to production environment.
  • Using Cognos, performed analysis and design and documented report requirements; also did performance tuning for ETL ODI applications.
  • Using Cognos, developed Business Objects reporting and dashboard universes.
  • Using Cognos, worked with users and coordinated with application developers to understand and capture data/reporting requirements.
  • Using Cognos, created and documented Business Objects universe design specifications.
  • Using Cognos, created and documented Business Objects report specifications with detailed database mappings.
  • Using Cognos, developed Web Intelligence and full-client reports.
  • Performed System Testing, Regression Testing, Acceptance Testing, Functional Testing and Stress Testing.
  • Created reports like Master/Detail reports, Cross Tab reports, slice and dice reports, and drill down reports
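The Type 1 and Type 2 SCD loads above were built as Informatica mappings. Purely to illustrate the Type 2 idea (expire the current row when an attribute changes, then insert a new current version), here is the same logic expressed over pipe-delimited files; the key|attr|eff_date|end_date layout and the 99991231 "current row" sentinel are made-up conventions for this sketch.

```shell
#!/bin/sh
# Illustrative Type 2 SCD merge over flat files (not the production
# Informatica implementation). Layouts are invented for this example:
#   dimension rows: key|attr|eff_date|end_date  (end_date 99991231 = current)
#   incoming rows:  key|attr|load_date
scd2_merge() {
    dim=$1; inc=$2
    awk -F'|' -v OFS='|' '
        NR == FNR {                 # pass 1: stage incoming changes
            new_attr[$1] = $2; load[$1] = $3; next
        }
        {                           # pass 2: walk the dimension
            if ($4 == "99991231" && ($1 in new_attr)) {
                if (new_attr[$1] != $2) {
                    print $1, $2, $3, load[$1]            # expire old version
                    print $1, new_attr[$1], load[$1], "99991231"
                } else print                               # unchanged: keep
                seen[$1] = 1
            } else print                                   # history passes through
        }
        END {                       # brand-new keys become fresh current rows
            for (k in new_attr) if (!(k in seen)) print k, new_attr[k], load[k], "99991231"
        }
    ' "$inc" "$dim"
}
```

A Type 1 load would instead overwrite the attribute in place, keeping only the latest value and no history.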

Environment: Informatica PowerCenter 7.1, Informatica PowerExchange, SQL Server 2000, Teradata 6, Oracle 9i, DB2, SQL, PL/SQL, mainframes, Sun Solaris, UNIX shell scripts, Cognos 8, Erwin, AutoSys, Remedy

Confidential, Lansing MI

Sr. Cognos Developer and Administrator

Responsibilities:

  • Worked closely with business users while gathering requirements, analyzing data and supporting existing reporting solutions.
  • Involved in upgrading from Cognos 8.4 to Cognos 10.1.
  • Designed and developed Framework Manager Models for SOA (Statement of Accounts), Monthly Financial Reports, Income Statements, Annual Financial Reports and 12-Month Expenditure & Income Reports using Oracle database as the source.
  • Created Framework Manager Models with Multiple Namespaces, calculated columns, filters and Macros from Oracle data source and published packages to suit the reporting needs of the client.
  • Created Dimensionally Modeled Relational (DMR) data sources for the Annual Financial Reports.
  • Designed and developed complex Cognos reports using Report studio (list, summary, cross-tab, repeater and drill through reports).
  • Developed various complex and customized (Inserting tables and Blocks) reports using prompts like value prompt, Tree prompt, date prompt, cascading prompt, Select & Search Prompt and using conditional formatting as well.
  • Created, refreshed and supported Cognos Transformer models and cubes to support timely report execution.
  • Burst reports to email and to system file folders based on the Unit Code hierarchy.
  • Involved in Cognos administration tasks such as creating new data source connections, creating packages, handling deployment of reports from the development to the production server, and managing user permissions on packages and folders.
  • Involved in moving the Cognos content store from MS SQL Server to Oracle.
  • Created version tables to archive fact data and dimensions based on the month version in UTSA's Finance schema.
  • Created Data Dictionaries to document database relationships & Data Models.
  • Involved in intensive end user training (both Power users and End users in Report studio and Query studio) with excellent documentation support.

Environment: Cognos BI 10, Cognos BI 8.4/8.4.1, Framework Manager, Report Studio, Query Studio, Transformer, PowerPlay Client, Oracle 10g/9i, SQL, SQL Developer, Windows 2008 R2/7, Toad

Confidential, Columbus, Ohio

Cognos Developer

Responsibilities:

  • Interacted with business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build Data Warehouse.
  • Assisted in designing Logical/Physical Data Models, forward/reverse engineering using Erwin 4.0.
  • Developed mappings to extract data from SQL Server, Oracle, Flat files, XML files, and load into Data warehouse using the Mapping Designer.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.
  • Developed Type 1 and Type 2 Slowly Changing Dimensions (SCD).
  • Developed models in Framework Manager and deployed packages to the Cognos Connection
  • Customized data by adding filters at both the framework level and the report level.
  • Published different packages from Framework Manager to Cognos Connection
  • Analyzed Relationships between tables in the schema and applied cardinality in the database layer
  • Implemented all business logic in Framework Manager by creating prompts, folders, calculated columns, joins, and naming conventions.
  • Worked on creating and analyzing complex reports such as cross-tab and drill-through reports.
  • Page explorer, query explorer and variable explorer were used to manage the content of the reports.
  • Created multiple page reports with page-break based on the first page of report using master-detail relationship.
  • Burst reports in various formats such as HTML, PDF, and Excel, sending them via e-mail based on sales area and administration level.
  • Created Dashboards comprising of list reports, cross-tab reports and chart reports using underlying multiple query structure by using filters, calculations, complex expressions, and conditional formatting.
  • Assisted in creating executive dashboard for management level reports using Report Studio.
  • Worked on complex filter condition, query join logic, conditional formatting, various type of prompts like value prompt, search & select prompt, Text Box prompt and Date & Time prompt
  • Applied different functionalities like filters, prompts for Cognos reports to retrieve relevant data
  • Created views, processes, dimensions in the process of creating cube and applied rules to the cube
  • Used layout component reference in page header, page footer and prompt page for uniformity across the reports.
  • Changed the appearance of the reports in Query Studio by reordering the report items, swapping rows and columns and by limiting the number of rows that should appear on a page
  • Used Cognos Connection to administer and schedule reports to run at various intervals.
  • Scheduled and Distributed reports using Schedule Management in Cognos Connection.
  • Set security for individual reports in Cognos Connection.
  • Upgraded the reports from Cognos 8.4 to Cognos 10

Environment: Cognos 10/8.4 Framework Manager, Report Studio, Query Studio, Analysis Studio, Transformers, Cognos Connection, SQL Server 2000/2005, HTML, MS-Access, MS- Excel, Windows XP

Confidential, Seattle WA

Informatica Developer

Responsibilities:

  • Involved in the full project life cycle, from analysis to production implementation and support, with emphasis on identifying sources and validating source data, developing the required logic and transformations, and creating mappings to load the data into the business intelligence database (BID).
  • Based on the EDS business requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Reviewed data models using Erwin tool to find out data model dependencies.
  • Designed and developed ETL solutions in Informatica PowerCenter 7.1.4.
  • Designed the ETL process and created ETL design and system design documents.
  • Developed code to extract, transform, and load (ETL) data from inbound flat files and various databases into various outbound files using complex business logic.
  • Created automated shell scripts to transfer files among servers using FTP, SFTP protocols and download files from web servers and hosted files.
  • Developed Informatica mappings, enabling the ETL process for large volumes of data into target tables.
  • Designed and developed process to handle high volumes of data and high volumes of data loading in a given load window.
  • Effectively used all kinds of data sources to process the data, finally creating load-ready files (LRF) as outbound files, which are inputs to the BID.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using the Maestro scheduling tool; created Maestro control files to handle job dependencies.
  • Wrote BTEQ scripts in Teradata and ran them via Korn shell scripts in HP-UX and SunOS environments.
  • Created MultiLoad (MLOAD), FastLoad, and TPump control scripts to load data into the BID.
  • Created control files to define job dependencies and scheduling using the Maestro tool.
  • Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
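The shell-plus-BTEQ pattern described above can be outlined as follows. This is a hedged sketch, not an actual project script: the logon string, stage table, columns, and import file are placeholders, and the bteq call is parameterized so the generator can be exercised without a Teradata client installed.

```shell
#!/bin/sh
# Illustrative Korn-shell-style BTEQ runner: write a BTEQ control script,
# then pipe it to the bteq client. All names and the logon string are
# placeholders for this sketch.
BTEQ=${BTEQ:-bteq}   # overridable so the generator can be tested offline

run_bteq_load() {
    script=$1
    cat > "$script" <<'EOF'
.LOGON tdprod/etl_user,xxxxxxx;
.IMPORT VARTEXT '|' FILE = /loads/claims.dat;
.REPEAT *
USING (claim_id VARCHAR(18), amt VARCHAR(12))
INSERT INTO stage.claims (claim_id, amt)
VALUES (:claim_id, :amt);
.LOGOFF;
.QUIT;
EOF
    $BTEQ < "$script"
}
```

For bulk volumes the same wrapper shape applies to MLOAD/FastLoad control scripts; BTEQ suits smaller row-by-row loads, while FastLoad and MultiLoad handle the high-volume paths.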

Environment: Informatica PowerCenter 7.1.4, ETL, Teradata V2R5 as BID, Business Objects, Oracle 10g/9i/8i, HP-UX, SunOS, Perl scripting, Erwin, PL/SQL, Maestro for scheduling

Confidential, Sacramento, California

Informatica Developer

Responsibilities:

  • Based on the requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Designed, developed Informatica mappings, enabling the extract, transport and loading of the data into target tables.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using Workflow Manager.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created complex mappings using Aggregator, Expression, Joiner transformations.
  • Designed and developed processes to handle high volumes of data and high-volume data loading within given load intervals.
  • Involved in testing of stored procedures and functions, and in unit and integration testing of Informatica sessions, batches, and the target data.
  • Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Designed and implemented mappings using SCD and CDC methodologies.
  • Designed and developed processes to handle high volumes of data and large-volume data loading in a given load window.
  • Extensively involved in migration of ETL environment, Informatica, Database objects.
  • Involved in splitting the enterprise data warehouse environment and the Informatica environment into three of each, one per company.
  • Involved in jobs scheduling, monitoring and production support in a 24/7 environment.

Environment: Informatica PowerCenter 7.1.1, Teradata V2R6, MicroStrategy, MS SQL Server 2000, Oracle 10g/9i/8i, Trillium, HP-UX, Perl scripting, Windows 2000, Erwin 4.2, PL/SQL

Confidential, Portland Oregon

Informatica Developer

Responsibilities:

  • Involved in Analysis, Requirements Gathering and documenting Functional & Technical specifications
  • Analyzed and created Facts and Dimension tables.
  • Designed ER diagrams, the logical model (relationships, cardinality, attributes, and candidate keys), and the physical database (capacity planning and object creation) for Oracle per business requirements using Erwin.
  • Used DB2, legacy systems, Oracle, and Sybase sources and Oracle as target.
  • Developed Informatica Power Center mappings for data loads and data cleansing.
  • Wrote stored procedures in PL/SQL and Unix Shell Scripts for automated execution of jobs
  • Wrote Shell Scripting for Informatica Pre-Session, Post-Session Scripts.
  • Designed technical layout considering Standardization, Reusability, and Scope to improve if need be.
  • Documented the purpose of the data warehouse (including transformations, mapplets, mappings, sessions, and batches) to help personnel understand the process and incorporate changes as necessary.
  • Developed complex mappings to extract source data from heterogeneous databases (Teradata, SQL Server, Oracle) and flat files, applied the proper transformation rules, and loaded it into the data warehouse.
  • Involved in identifying bugs in existing mappings by analyzing data flow, evaluating transformations using Debugger.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Worked closely with the Production Control team to schedule shell scripts, Informatica workflows, and PL/SQL code in AutoSys.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Defects were tracked, reviewed and analyzed.
  • Conducted UAT (User Acceptance Testing) with user community
  • Developed Korn shell scripts to run from Informatica pre-session and post-session commands; set up on-success and on-failure emails to send reports to the team.
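The on-success/on-failure email step can be sketched as a small wrapper like the one below. It is illustrative only: mailx is assumed as the mail client (hence the overridable MAILER variable), and the address and subject lines are hypothetical.

```shell
#!/bin/sh
# Illustrative on-success / on-failure notifier for a load step.
# MAILER defaults to mailx (an assumption); the address is a placeholder.
MAILER=${MAILER:-mailx}

notify_and_run() {
    desc=$1; shift            # remaining args are the command to run
    if "$@"; then
        echo "load $desc succeeded" | $MAILER -s "SUCCESS: $desc" etl-team@example.com
    else
        rc=$?
        echo "load $desc failed rc=$rc" | $MAILER -s "FAILURE: $desc" etl-team@example.com
        return $rc
    fi
}
```

A post-session command would wrap the actual load step, e.g. `notify_and_run nightly ./load_dim.sh`, so the team hears about both outcomes without checking logs.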

Environment: Informatica, Oracle 9i, PL/SQL, Cognos Impromptu 6.0, Cognos PowerPlay 6.6, Erwin 4.0, UNIX, Windows NT

Confidential

ETL Developer

Responsibilities:

  • Used Informatica Workflow Manager to create, schedule, execute and monitor sessions, worklets and workflows.
  • Created new mappings according to business rules to extract data from different sources, transform and load target databases.
  • Debugged the failed mappings and fixed them.
  • Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
  • Defects were tracked, reviewed, and analyzed.
  • Modified the mappings according to the new changes and implemented the persistent cache in several mappings for better performance.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Involved in writing stored procedures and shell scripts for automating the execution of jobs in pre and post sessions to modify parameter files, prepare data sources.
  • Identified the issues in sources, targets, mappings and sessions and tuned them to improve performance.
  • Created and used reusable mapplets and worklets to reduce the redundancy.
  • Developed robust Informatica mappings and fine-tuned them to process large volumes of input records.

Environment: Informatica PowerCenter, Oracle 9i, SQL, PL/SQL, Solaris, MS Visio, MS Access
