
Informatica Developer/Data Warehouse Analyst Resume

Ashburn, VA

SUMMARY:

  • 8+ years of total IT experience in the analysis, design, modeling, development, implementation, and testing of Data Warehouse applications, with strong analytical, coordination, interpersonal, and leadership skills.
  • Excellent knowledge of the Software Development Life Cycle (SDLC); adhered to standards and guidelines throughout the development life cycle and during maintenance of the models.
  • Strong documentation and knowledge-sharing skills; conducted data modeling sessions for different user groups, facilitated common data models between different applications, and participated in requirement sessions to identify logical entities.
  • Extensive Experience working with business users/SMEs as well as senior management.
  • Strong understanding of the principles of Data warehousing using Fact Tables, Dimension Tables, star and snowflake schema modeling.
  • Strong experience with Database performance tuning and optimization, query optimization, index tuning, caching and buffer tuning.
  • Extensive experience in relational Data modeling, Dimensional data modeling, logical/Physical Design, ER Diagrams, forward and reverse engineering, Publishing ERWIN diagram, analyzing data sources, creating interface documents etc.
  • Extensive experience in Enterprise Information Management and Architecture technologies including Information Lifecycle, Master Data Management and Business Intelligence.
  • Extensively used ERWIN to design Logical/Physical Data Models, forward/reverse engineering, publishing data model to acrobat PDF files. Provided specifications for Data Model design, Logical and Physical Database design.
  • 4+ years of strong experience in the installation, configuration, and administration of Informatica PowerCenter Client and Server.
  • 4+ years of recent hands-on experience with QlikView Desktop, QlikView Management Console (QMC), QlikView Publisher, QlikView Web Server.
  • Expertise in QlikView Hierarchies, QlikView Architecture, Data-warehousing and Dimensional modeling.
  • Experience in loading data from multiple data sources like DB2, Netezza, SQL Server, Oracle, Amazon Redshift, MS Access, Excel, HTML, XML, etc.
  • Expert in developing QlikView Dashboards using Chart (Drill down & Cyclic Grouping), List Box, Input Field, Table Box and Calendar & Date Island.
  • Strong knowledge of QlikView scripting, set analysis, aggregation functions, multi-dimensional objects, and layout and design, including chart properties and formats. Expertise in using QlikView functions (Date and Time, Keep, Join, Mapping, String, Input Fields, etc.); a brief set analysis sketch follows this list.
  • Strong knowledge in basic components of QlikView Enterprise like List Box, Multi Box, Tables, Charts, Current Selection Box, Buttons, Containers etc., and experience in editing scripts for creating variables, fields, and calculated expressions, basic and aggregated functions.
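
A minimal illustration of the set analysis style referenced above; the field names (Year, Amount) are hypothetical placeholders rather than fields from any specific engagement:

    // Chart expression: current-year amount for the user's selections, ignoring any Year selection
    Sum({<Year = {$(=Max(Year))}>} Amount)

    // Chart expression: prior-year amount, typically paired with the above for a year-over-year KPI
    Sum({<Year = {$(=Max(Year) - 1)}>} Amount)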

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.x/8.x, DataStage

Languages: SQL, PL/SQL, C, Java

Database Technologies: Oracle SQL Developer 3.0.04, Oracle 11g/10g, MS SQL Server, Teradata

Business Intelligence Tools: QlikView 9.0/10.0/11.0/11.2/12, OBIEE, Cognos, Pandora from Experian

Operating Systems: Windows Hyper-V, Windows 9/10, CentOS, Ubuntu, Unix, Linux

PROFESSIONAL EXPERIENCE:

Confidential, Ashburn, VA

Informatica Developer/Data Warehouse Analyst

Responsibilities:

  • Involved in all phases of SDLC from requirement, design, development, testing, pilot, training and rollout to the field user and support for production environment.
  • Consulted with application development business analysts to translate business requirements into data design requirements used for driving innovative data designs that meet business objectives. Expertise in relational data design and modeling, system study and development by applying Ralph Kimball methodology of dimensional modeling.
  • Performed requirements and design analysis for two customer development projects: one for automatic periodic investment for mutual fund purchases and one for customer contact history requirements for customer relationship management. Led information-gathering meetings with users to gather business requirements. Used event analysis and data modeling techniques to gather and document requirements. Delivered the business requirements document and preliminary logical data model.
  • The goal of the client was to re-engineer their current systems that supported their credit and debit card transaction processing. Performed as the lead for the business requirements gathering task. Interviewed users and extracted business requirements and documented current process flows. Wrote and delivered the high-level business requirements document, process flows and a conceptual data model. Conducted structured walkthroughs with the entire user community.
  • Supported data warehouse projects with logical and physical data modeling in Oracle environment. Used a structured change management approach for coordinating the timely and accurate delivery of the data model, documentation changes and DDL to the development teams.
  • Documented the data requirements and system changes into detailed functional specifications, created data mapping documents from various source data feeds to the target databases.
  • Used Informatica PowerCenter for extraction, transformation, and loading. Created mappings, mapplets, and sessions; identified and measured data quality, and designed and implemented a data profiling and data quality improvement solution to analyze, match, cleanse, and consolidate data before loading into the data warehouse.
  • Used stored procedures, functions, materialized views, and triggers at the database level and imported them into Informatica for ETL.
  • Designed and developed QlikView software applications and complex server-based QlikView dashboards from scratch to support reporting and Business Intelligence initiatives.
  • Interacted directly with stakeholders and clients on a daily basis for requirement gathering, requirement analysis, work estimation, design, development, testing, and migration of QlikView reports to production.
  • Designed and developed the QlikView platform to integrate with Expandable databases.
  • Analyzed source systems, designed the data model, and loaded data from multiple data sources such as Amazon Redshift, Access, and .CSV files.
  • Connected to Amazon Redshift tables using OLE DB and ODBC data source connectors.
  • Configured the NPrinting On-Demand extension object in a clustered QlikView environment.
  • Worked on data modeling and developed complex models per user requirements without closed loops, synthetic tables, or ambiguous relationships. Created complex expressions for dynamic aggregation, actions and triggers, and document chaining and linking.
  • Created pixel-perfect reports, Excel reports, Word reports, and PowerPoint reports using NPrinting Designer and automated the jobs on the NPrinting Server.
  • Published client applications using Qlik Sense Publisher and rolled them out to the AccessPoint; administered security using Active Directory/Access DB integration. Created a Master Calendar and worked with date fields as per user needs (see the load-script sketch after this list).
  • Developed QlikView Dashboards using different Charts, Drill through, Cyclic Groups, List, Input Field, Table Box, Container, Variables, Calendars, Sliders and Bookmark objects.
  • Published dashboards on the AccessPoint. Deployed QVWs on the production server and created reload and distribution tasks on QlikView Server/Publisher.
  • Involved in Production Support for QlikView applications & Server
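
A sketch of the load-script pattern behind the Redshift connection and Master Calendar bullets above, assuming a hypothetical ODBC DSN and hypothetical table and field names:

    // Connect to Amazon Redshift through a pre-configured ODBC DSN (DSN name is hypothetical)
    ODBC CONNECT TO 'Redshift_DSN';

    Orders:
    LOAD order_id, customer_id, order_date, amount;
    SQL SELECT order_id, customer_id, order_date, amount
    FROM public.orders;

    // Derive the date range of the fact table, then build a Master Calendar covering it
    MinMax:
    LOAD Min(order_date) AS MinDate, Max(order_date) AS MaxDate RESIDENT Orders;
    LET vMinDate = Num(Peek('MinDate', 0, 'MinMax'));
    LET vMaxDate = Num(Peek('MaxDate', 0, 'MinMax'));
    DROP TABLE MinMax;

    MasterCalendar:
    LOAD
        TempDate                      AS order_date,
        Year(TempDate)                AS Year,
        Month(TempDate)               AS Month,
        'Q' & Ceil(Month(TempDate)/3) AS Quarter;
    LOAD Date($(vMinDate) + IterNo() - 1) AS TempDate
    AUTOGENERATE 1
    WHILE $(vMinDate) + IterNo() - 1 <= $(vMaxDate);

The calendar associates to the fact table through the shared order_date field, which avoids synthetic keys and ambiguous relationships in the model.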

Environment: Informatica PowerCenter 9.6/9.1, QlikView 12, Qlik Sense, Oracle, Netezza, SQL Scripts, VB Script, MS Access, XML files.

Confidential - St. Louis, MO

Data Analyst/ETL developer

Responsibilities:

  • Identified the functional and technical requirements for the enterprise data warehouse solution.
  • Defined the ETL mapping specifications from business requirements provided by the BA.
  • Designed the ETL process to source the data from source systems and load it into DWH tables.
  • Involved in Data Warehouse landing zone schema Design and provided Data granularity to accommodate time phased CDC requirements.
  • Designed, developed, monitored, and scheduled workflows, sessions, event-based tasks, parameter files, and pre-session/post-session tasks for the development/UAT/SIT environments.
  • Developed advanced Linux shell scripts to validate source files, automate archival of log files, and create ETL event start/stop files.
  • Provided a performance-optimized solution to eliminate duplicate records.
  • Identified and resolved performance bottlenecks in sources, targets, mappings, databases, and the UNIX file server.
  • Provided the team with technical leadership on ETL design, development best practices, version control, and customization of data loads.
  • Worked on the SAP BO to Qlik migration of the Finance module of Confidential. Performed metadata analysis of SAP BO universes and reports.
  • Analyzed and shortlisted critical universes and reports that could be migrated to QlikView and Qlik Sense.
  • Worked with senior management to plan, define, and clarify migration goals, objectives, and requirements.
  • Created QVDs for a universe to develop the data model (see the QVD extract sketch after this list).
  • Gathered data from sources such as Teradata, Excel, SQL Server 2008, binary QVWs, inline tables, and txt files when developing the QlikView data models.
  • Developed data models based on the client requirements and without any ambiguous relationships or synthetic tables.
  • Designed, developed, tested, debugged, and implemented data models based upon the specified requirements.
  • Developed Dashboards using all functions of QlikView and Qlik Sense similar to the reports in SAP BO.
  • Developed Qlik Sense Applications using features such as KPI Object, Pivot Table, etc.
  • Used QlikView functions (Date and Time, Keep, Join, Mapping, String, Input Fields, etc.).
  • Designed, developed, tested, debugged, and implemented Qlik Sense applications based upon the specified requirements.
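
A minimal sketch of the QVD extract layer mentioned above, assuming a hypothetical Teradata DSN and hypothetical table and field names:

    // Extract layer: pull each source table once and persist it as a QVD for reuse across models
    ODBC CONNECT TO 'Teradata_DSN';   // DSN name is a placeholder

    CustomerDim:
    LOAD customer_id, customer_name, region;
    SQL SELECT customer_id, customer_name, region
    FROM edw.customer_dim;

    STORE CustomerDim INTO CustomerDim.qvd (qvd);
    DROP TABLE CustomerDim;

    // Downstream dashboards then reload from the QVD, which is an optimized load
    // as long as no transformations are applied:
    // CustomerDim: LOAD * FROM CustomerDim.qvd (qvd);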

Environment: Informatica 9.6/9.1, QlikView 11, Qlik Sense, NPrinting, Teradata, SAP BO, MS Excel, SQL Server 2008

Confidential - Hoboken, NJ

Data Analyst/ETL developer

Responsibilities:

  • As a Data Analyst, was responsible for designing, modeling, and creating databases on the basis of third-party data structures, normalizing or denormalizing data according to business requirements, and facilitating data model walkthroughs with data modelers and users.
  • Worked on Master Data Management (MDM), analytics and data integration using Service Oriented Architecture (SOA).
  • Worked on architectural review and developed programs for data governance and metadata management.
  • Participated in Business Analysis, talking to business Users and determining the entities for Data Model.
  • Evaluated and enhanced current data model as per the requirement
  • Analyzed the source data coming from Oracle, SQL server, SAS, Lawson and flat files. Worked with Data Warehouse team in developing Dimensional Model. Designed and developed Star Schema and created Fact and Dimension Tables for the Warehouse using Erwin.
  • Designed ETL specification documents to load the data in target using various transformations according to the business requirements.
  • Extensively used Informatica- Power center for extracting, transforming and loading into different databases.
  • Identified requirements from business users and suggested the best ways of presenting charts, which were accepted and successfully implemented in the dashboards.
  • Developed QlikView dashboards using different charts, drill-downs, cyclic groups, list boxes, input fields, table boxes, containers, etc.
  • Worked with senior management to plan, define, and clarify dashboard goals, objectives, and requirements.
  • Designed, developed, tested, debugged, and implemented QlikView solutions based upon the specified requirements.
  • Reused existing QVDs, created new tables as QVDs, and wrote scripts to generate the fields needed to model the data according to the requirements.
  • Developed data models based on the client requirements and without any ambiguous relationships or synthetic tables.
  • Used QlikView functions (Date and Time, Keep, Join, Mapping, String, Input Fields, etc.); a brief mapping/keep sketch follows this list.
  • Extracted, transformed, and loaded data from multiple sources into QlikView applications.
  • Created dashboard-style reports using QlikView and Qlik Sense components like list boxes, sliders, buttons, charts, and bookmarks.
  • Provided ongoing maintenance and support to existing QlikView documents.
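
A short load-script sketch of the Mapping and Keep functions referenced above; file names and field names are hypothetical placeholders:

    // Mapping table used to translate source codes into readable values
    RegionMap:
    MAPPING LOAD region_code, region_name
    FROM regions.csv (txt, utf8, embedded labels, delimiter is ',');

    Sales:
    LOAD
        order_id,
        product_id,
        ApplyMap('RegionMap', region_code, 'Unknown') AS Region,
        Date#(order_dt, 'YYYY-MM-DD')                 AS OrderDate,
        amount
    FROM sales.csv (txt, utf8, embedded labels, delimiter is ',');

    // LEFT KEEP restricts Products to rows that match Sales on product_id
    // while keeping the two tables separate in the data model (no join, no synthetic keys)
    Products:
    LEFT KEEP (Sales)
    LOAD product_id, product_name
    FROM products.csv (txt, utf8, embedded labels, delimiter is ',');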

Environment: QlikView 11, Oracle 11g, DB2, MS SQL Server 2008, Toad, Informatica 9.x, Windows 2008 R2, Excel, MS Access, Flat files

Confidential, NC

Data analyst/Modeler

Responsibilities:

  • Participated in Joint Application Design/Development (JAD) sessions and involved in discussions with clients/users to determine the dimensions, hierarchies and levels, basic measures, derived measures and key metrics for relational / multidimensional data model for the system, understanding reporting requirements, designing star schema data model. Extensively involved in requirement analysis.
  • Experienced using ERwin for Data modeling for creating Logical and Physical Design of Database and ER Diagrams with all related entities and relationship with each entity based on the rules provided by the business manager. Worked extensively in forward and reverse engineering processes.
  • Participated in Requirement Gathering, Business Analysis, User meetings, discussing the issues to be resolved and translating user inputs into ETL design documents.
  • Worked on the Data Warehouse team analyzing data files that had to be compiled from disparate non-production sources and readied them for production. Tasks included comparing data to requirements documentation and creating data layouts, the data dictionary, and the pipe-delimited and fixed-width files. In addition, served as team lead for managing the completion and loading of these files for the Data Management group.
  • Participated as member of a project team performing various lifecycle tasks for an intranet application with Oracle database for tracking public offering orders. Developed logical and physical database models and data dictionary documentation. Worked as liaison between users and developers refining and explaining user requirements.
  • Assessed data quality requirements in terms of data completeness, consistency, conformity, accuracy, referential integrity, and duplication; evaluated vendors (Informatica Data Explorer/Data Quality); identified and measured data quality; and designed and implemented a data profiling and data quality improvement solution to analyze, match, cleanse, and consolidate data before loading into the data warehouse.
  • Developed data quality reports/dashboards with Cognos Framework Manager and Report Studio.
  • Designed the relational data model for operational data store and staging areas, Designed Dimension & Fact tables for data marts.
  • Extensively involved in various data modeling tasks including forward engineering, reverse engineering, complete compare, creating DDL scripts, creating subject areas, maintaining the data dictionary, publishing models to PDF and HTML formats, generating various data modeling reports, etc.
  • Designed a Conceptual Data Model, Logical Data Model, and Physical Data Model using Erwin.
  • Worked as a Data Warehouse and Database Developer; core responsibility was to interview customers and subject matter experts to gather requirements for a proposed database, then analyze those requirements and the source systems needed to build the data warehouse using Data Warehouse Builder.
  • Served as Data Architect in a centralized group producing logical and physical data models in ERwin; also wrote database interface specifications and documented them in the Data Manager data dictionary.
  • Designed Informatica ETL Mappings documents, created ETL staging area framework, created data mapping documents, data flow diagram, ETL test scripts etc.
  • Extensively used Erwin and Normalization Techniques to design Logical/Physical Data Models, relational database design.
  • Experience with utilities Informatica Data Quality (IDQ) and Informatica Data Explorer (IDE).
  • Conducted data modeling sessions for different user groups, facilitated common data models between different applications, participated in requirement sessions to identify logical entities.
  • Extensively involved in creating, maintaining, updating documents like requirement document, database design document, System design document, naming standard procedures, SOPs etc.
  • Analyzed and optimized the existing business processes using Conceptual Models and Data Flow Diagram. The process included performance enhancement requirements assessment and mapping of existing work flow and data flows.

Environment: Erwin, Informatica PowerCenter 8.6/9.1, IBM DataStage, Power Exchange, DB2 v9.5, XML/XSD, Qwest Center, MS SQL Server, Teradata, RedHat Linux Shell Scripts, PL/SQL, Cognos, Test Track Pro, TOAD, PVCS, Microsoft Project Plan

Confidential, New Brunswick, NJ

Data warehouse Analyst

Role & Responsibilities:

  • Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, and Transformations.
  • Involved in designing the procedures for getting the data from all systems to Data Warehousing system. The data was standardized to store various Decision making Units in tables.
  • Data modeling by ER diagrams and process modeling using data flow diagram in the design process.
  • Created UNIX Environment variables in various .ksh files.
  • Analyzed the technical, organizational, and economic feasibility of the project.
  • Created Informatica mappings with PL/SQL procedures/functions to build decision rules to load data.
  • Extensively used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, and Sequence Generator.
  • Extensively Used ETL to load data from different databases and flat files to Oracle.
  • Involved in the data migration from COBOL files on the mainframe (AS/400) to DB2.
  • Extensively worked on Database Triggers, Stored Procedures, Functions and Database Constraints.
  • Created the Universes using Business Objects Designer.
  • Created Hierarchies to provide the drill down functionality to the end-user.
  • Created various classes, objects, Filters, Conditions in the Universes.

Environment: Informatica PowerCenter 9.x, SQL Server, Oracle 11g, Embarcadero, UNIX
