Data Analyst/ETL Developer Resume
Mount Laurel, NJ
SUMMARY
- 8 years of experience in the Information Technology (IT) Industry serving as a Data Analyst in a Data warehousing environment.
- Vast experience in data management, including data analysis, gap analysis, data mapping, data validation, data profiling, data cleansing, data scrubbing, and reference data.
- Worked extensively with Dimensional Modeling and Data Migration.
- Experience in Data Transformation, Metadata, and Data dictionary, Data Loading, Modeling and Performance Tuning.
- Experience in DBMS/RDBMS implementation using object-oriented concepts and database toolkits.
- Data Warehousing experience using Informatica PowerCenter 9.5/9.1/8.6/8.1/7.x, PowerConnect, Repository Admin Console, Repository Manager, Designer, Workflow Manager, Workflow Monitor, ETL, Data marts, OLAP, OLTP.
- Experience with Database Systems like Oracle 11g/10g/9i, MS SQL Server, Teradata and DB2.
- Excellent experience in extracting, transforming, and loading data from various source databases to target databases.
- Implemented Slowly Changing Dimensions Type 1 and Type 2 methodology for accessing the full history of account and transaction information.
- Experience in preparing project cutover plans and test plans with test cases for unit testing, regression testing, and User Acceptance Testing (UAT).
- Highly skilled in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features.
- Extensive experience in Data Analysis and ETL Techniques for loading high volumes of data and smooth structural flow of the data.
- Experience with creating reports using Business Objects.
- Created and executed Test Plan, Test Scripts, and Test Cases based on Design document and User Requirement document for testing purposes.
- Extensive experience in the testing environment, including User Acceptance Testing (UAT), functional testing, system testing, and defect tracking using ClearQuest.
- Coordinated and prioritized outstanding defects and system requests based on business requirements. Acted as a liaison between the development team and the management team to resolve any conflicts in terms of requirements.
- Experience with Star schema modeling and knowledge of Snowflake dimensional modeling.
- Experience with SQL and PL/SQL objects: creating packages, writing procedures and functions, and tuning using various types of partitioning and indexes.
- Experience in providing Production Batch support.
- Interacted with all levels of the project development team, from end users to Software Architects, Technical Lead, Database Administrators, and System Administrators.
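The Slowly Changing Dimension Type 2 pattern mentioned above, which preserves full history rather than overwriting dimension rows, can be sketched as follows. This is a minimal illustration using Python with SQLite; the table and column names (`customer_dim`, `valid_from`, `valid_to`) are hypothetical, not the actual warehouse schema.

```python
import sqlite3

# Minimal SCD Type 2 sketch: table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer_dim (
        customer_id INTEGER,
        address     TEXT,
        valid_from  TEXT,
        valid_to    TEXT,       -- NULL means this is the current version
        is_current  INTEGER
    )
""")

def scd2_load(conn, customer_id, address, load_date):
    """Expire the current row if the attribute changed, then insert a new version."""
    cur = conn.execute(
        "SELECT address FROM customer_dim WHERE customer_id = ? AND is_current = 1",
        (customer_id,))
    row = cur.fetchone()
    if row is not None and row[0] == address:
        return  # no change: nothing to do
    if row is not None:
        # Type 2: close out the old version instead of overwriting it
        conn.execute(
            "UPDATE customer_dim SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (load_date, customer_id))
    conn.execute(
        "INSERT INTO customer_dim VALUES (?, ?, ?, NULL, 1)",
        (customer_id, address, load_date))
    conn.commit()

scd2_load(conn, 101, "12 Oak St", "2015-01-01")
scd2_load(conn, 101, "98 Elm Ave", "2015-06-01")   # address change -> new version
history = conn.execute(
    "SELECT address, valid_from, valid_to FROM customer_dim "
    "WHERE customer_id = 101 ORDER BY valid_from").fetchall()
# Full history is preserved: the old row is expired, the new row is current.
```

In a PowerCenter mapping this same compare-expire-insert logic is typically built with Lookup, Expression, and Update Strategy transformations rather than hand-written SQL.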
TECHNICAL SKILLS
Database: Teradata, DB2, Oracle, SQL Server 2012, Access DB
Data Warehouse Concept: Star Schema, Snowflake schema, OLTP, OLAP
Operating Systems: UNIX, Windows
Languages: SQL, PL/SQL
Reporting Tools: Business Objects XI, Cognos 7.x/8.x/10.x, QlikView
ETL Tools: Informatica 9.5/9.1/8.6/8.5/8.1, DataStage, SSIS
Modelling Tools: Erwin, PowerDesigner, MS Visio
Desktop Software: Microsoft Word, Excel, and PowerPoint
PROFESSIONAL EXPERIENCE
Confidential, Mount Laurel, NJ
Data Analyst/ETL developer
Responsibilities:
- Analyzed source data coming from legacy systems, Oracle, DB2, SQL Server, PeopleSoft, and flat files. Worked with the Data Warehouse team in developing the dimensional data model.
- Coordinated with different application teams to develop and standardize an enterprise-wide data model; created a dimensional data model using the Ralph Kimball methodology; designed and developed fact tables, dimension tables, and conformed fact and dimension tables.
- Analyzed reference data extracted from source tables and transformed to target tables using SQL queries.
- As an architect, responsible for designing, modeling, and creating databases; normalizing or denormalizing data according to business requirements; and creating star and snowflake schemas.
- Experience with the Ab Initio Data Quality Suite, a drag-and-drop ETL application used for handling large volumes of data.
- Extensively interacted with user for requirement gathering, prototyping and prepared various documents like Interface Requirement Document, Customer Requirement document, Integration test plan, Unit test plan, Release notes etc.
- Established clear allocation of responsibility for deliverables from the outset, a prerequisite for a successful data migration.
- Developed and implemented an automated process to test FX washes and set thresholds for Actimize alerts using Excel, importing bulk data from remote database sources for analysis.
- Extensively worked on analyzing data, applying business rules, designing the data warehouse, creating data marts/schemas, partitioning the data, creating efficient ETL processes to load the data, and automating the ETL process.
- Participated in the complete formal design process from initial specifications and requirements. Involved in creating technical design documentation.
- As team lead, worked closely with all team members to assign work and track status, and helped them complete their tasks where required.
- Extensively used Erwin to design logical/physical data models, perform forward/reverse engineering, and publish data models to Acrobat files.
- Designed and developed Data warehouse, Data marts, star schemas, fact table, Dimension tables, Slowly Changing Dimension, Staging areas, Operational Data Store.
- Involved in designing, development and testing of Interfaces to communicate with third party data.
- Managed Oracle 11g databases for daily data loads.
- Worked with ETL team responsible for Designing, modeling, and creating ETL procedures to load bulk data into the highly denormalized database.
- Designed ETL specification documents to load the data in target using various transformations according to the business requirements.
- Worked with reporting team in generating various reports using BO XI and helped them in providing the data according to their requirement.
- Involved in designing Universes incorporating all the business requirements and creating hierarchies to support drill down functionalities in reports.
- Created UNIX shell scripts for data-centric applications.
- Developed shell scripts for scheduling jobs.
- Extensively used shell scripts for loading data and monitoring all bulk data loads.
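The load-monitoring step described above often amounts to a record-count reconciliation between the source extract and the target table. Below is a minimal sketch of that check in Python with SQLite; the file layout, table name (`stg_orders`), and one-record-per-line format are hypothetical assumptions for illustration.

```python
import os
import sqlite3
import tempfile

def rowcount_check(flat_file, conn, table):
    """Post-load monitoring sketch: compare the flat-file record count
    (non-blank lines, assuming one record per line) to the target row count."""
    with open(flat_file) as f:
        expected = sum(1 for line in f if line.strip())
    loaded = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {"expected": expected, "loaded": loaded, "ok": expected == loaded}

# Demo with a throwaway file and an in-memory table (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (id INTEGER)")
conn.executemany("INSERT INTO stg_orders VALUES (?)", [(1,), (2,), (3,)])

fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("1\n2\n3\n")

result = rowcount_check(path, conn, "stg_orders")
os.remove(path)
```

In production the same comparison is commonly done in a shell wrapper (`wc -l` against a `SELECT COUNT(*)`), with a non-zero exit code raised on mismatch so the scheduler can alert.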
Environment: Oracle, Erwin, SQL Server, Informatica PowerCenter 9.5/9.1, IDQ, IDE, PowerExchange, BO XI, OBIEE, flat files, Tivoli, Perl, UNIX, Shell Scripting
Confidential, Plymouth, MN
Data Analyst/ETL developer
Responsibilities:
- Gathered Business requirements from the end users and reporting team
- Participated in the entire life cycle of the project, which involved understanding scope of the project, functionality, Data modeling, technical Architecture document, technical design and complete development.
- Extensively used Erwin to design logical/physical data models, perform forward/reverse engineering, and publish data models to Acrobat files.
- Designed and developed Operational Data Store, Slowly Changing Dimension, Staging areas etc.
- Designed Informatica Architecture for development, integration and system test environments
- Involved in extraction of data from various sources like flat files, Oracle and SQL Server
- Associated with the design and development of 40+ mappings.
- Extensively worked with SQL queries. Created Stored Procedures, packages, Triggers, Views using PL/SQL Programming.
- Optimized the performance of the Informatica mappings
- Validated complex mappings involving Filter, Router, Expression, Lookup, Update Strategy, Sequence generator, Joiner and Aggregator transformations.
- Used debugger to test the mapping and fixed the bugs. Created and used Debugger sessions to debug sessions and created breakpoints for better analysis of mappings.
- Created mapping variables and parameters and used them appropriately in mappings.
- Extensively used all the features of Informatica 8.6 including Designer, Workflow manager and Repository Manager, Workflow monitor.
- Designed and developed several ETL scripts using Informatica, UNIX shell scripts
- Extensively used UNIX and FTP Scripts for triggers to kick off the TIBCO process once data is loaded in the outbound table.
- Implemented Slowly Changing dimension methodology (type 2) for accessing the full history of accounts and transaction information.
- Did QA of ETL processes, migrated Informatica objects from development to QA and production using deployment groups.
- Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.
- Designed and Developed pre-session, post-session routines for Informatica sessions to drop and recreate indexes and key constraints for Bulk Loading.
- Involved in Performance Tuning at various levels including Target, Source, Mapping and Session for large data files.
- Co-ordinated with the off-shore teams and mentored junior developers.
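The pre-/post-session routine mentioned above (drop indexes before a bulk load, recreate them afterwards) can be sketched as follows, using Python with SQLite purely for illustration; the `fact_txn` table and `idx_fact_acct` index are hypothetical names, and in PowerCenter these steps would run as pre-/post-SQL or pre-/post-session commands.

```python
import sqlite3

# Sketch of the pre-/post-session pattern for bulk loading:
# drop indexes first so the load avoids per-row index maintenance,
# then rebuild them once in a single pass afterwards.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_txn (acct_id INTEGER, amount REAL)")
conn.execute("CREATE INDEX idx_fact_acct ON fact_txn (acct_id)")

def bulk_load(conn, rows):
    conn.execute("DROP INDEX IF EXISTS idx_fact_acct")        # pre-session step
    conn.executemany("INSERT INTO fact_txn VALUES (?, ?)", rows)
    conn.execute("CREATE INDEX idx_fact_acct ON fact_txn (acct_id)")  # post-session step
    conn.commit()

bulk_load(conn, [(i, float(i)) for i in range(1000)])
count = conn.execute("SELECT COUNT(*) FROM fact_txn").fetchone()[0]
index_exists = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master WHERE type='index' AND name='idx_fact_acct'"
).fetchone()[0] == 1
```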
Environment: Informatica Power Center 9.x, Oracle 11g, DB2, MS SQL Server, Erwin, TOAD, PL/SQL, Shell Scripting, UNIX.
Confidential, Vineland, NJ
Data Analyst/ETL Developer
Responsibilities:
- Facilitated collection of functional requirements from system users and preparation of business requirement documents using Rational Requisite Pro that provided appropriate scope of work for technical team to develop prototype and overall system.
- Played a pivotal role in JAD sessions helping to develop detailed solutions and documented the Functional and System specification requirements.
- Worked as an interface between end users and the different teams involved in application development, promoting better understanding of business and IT processes.
- Followed a structured approach to organize requirements into logical groupings of essential functional requirements, non-functional requirements, interface requirements, and report requirements, ensuring that critical requirements were not missed.
- Conducted gap analysis to identify gaps in the requirements and conveyed them to the various modules of the project.
- Prepared Business Process Models that included modeling of all business activities from the conceptual to the procedural level.
- Prepared Workflow diagrams to analyze AS IS and TO BE scenarios, designed new process flows using MS Visio and documented the business process and various business scenarios.
- Performed quality audits of the requirements and distributed to all stakeholders. Conducted user interviews and JAD Sessions as a part of the requirements elicitation.
- Documented Test Plans that contains test scripts, test cases, test data and expected results for the Integration, Functional, Performance, and User Acceptance Testing.
- Worked on source system analysis for different source systems such as JD Edwards, QuickBooks, and SAP.
- Gathered all transaction fields and data from all these different source systems.
- Worked on data profiling and data definitions after gathering all fields from the sources.
- Worked on source-to-target field mappings and defined the transformation rules for each field.
- Created data definitions in Excel for every field of each source system, which came from different database engines.
- Created a source-mapping Excel sheet for the group in which all source and target fields were mapped and all business rules for mapping the fields were applied.
- Also worked on Intercompany Transaction activity for reconciliation process.
- Created complex SQL queries and performed SQL performance tuning.
- Team Building, creating production and analysis report, documentation for various data processes.
- Experienced in communicating with different source contacts and scheduling meetings with source business and technical personnel.
- Created final source approval and gap/recommendation technical documents for every source system.
- Designed some Source system related diagrams using Microsoft Visio and prepared presentation on that.
- Prepared documented reports for the usage of existing systems and new systems.
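The source-to-target mapping sheet described above pairs each target field with a source field and a transformation rule. A minimal sketch of how such a sheet is applied is shown below in Python; the field names (`CUST_NM`, `ORD_AMT`) and rules are hypothetical examples, not the actual mapping.

```python
# Sketch of source-to-target field mapping with per-field transformation
# rules, mirroring the mapping sheet described above. All names are
# hypothetical illustrations.
mapping = {
    # target field  : (source field, transformation rule)
    "customer_name" : ("CUST_NM", lambda v: v.strip().title()),
    "order_amount"  : ("ORD_AMT", lambda v: round(float(v), 2)),
    "source_system" : (None,      lambda v: "JDE"),  # constant/default rule
}

def apply_mapping(source_row, mapping):
    """Build a target row by applying each field's transformation rule."""
    target = {}
    for tgt_field, (src_field, rule) in mapping.items():
        raw = source_row.get(src_field) if src_field else None
        target[tgt_field] = rule(raw)
    return target

row = apply_mapping({"CUST_NM": "  john SMITH ", "ORD_AMT": "199.999"}, mapping)
```

Keeping the rules in a table like this (rather than hard-coded in each load) is what lets one mapping sheet drive loads from several source systems.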
Environment: MS Office Suite (Word, Excel, Access, PowerPoint, Outlook), MS Project, Visio, Informatica 8.x, Oracle 10g, DB2, SQL Server 2008/2005
Confidential
ETL Developer
Responsibilities:
- Identifying the information needs within and across functional areas of an organization.
- Modeling the process in the enterprise wide scenario.
- Identifying the integration points between functional areas in an enterprise-wide model.
- Identifying benefits of cross-functional integration in an enterprise-wide system.
- Managed database optimization and table-space fragmentation.
- Tuned the performance of queries by working intensively over indexes.
- Used dynamic SQL to perform pre- and post-session tasks required while performing extraction, transformation, and loading.
- Created different source definitions to extract data from flat files and relational tables for Data mart.
- Created different target definitions.
- Created, modified, deployed, optimized, and maintained Business Objects Universes using Designer.
- Designed the ETL process using Informatica to populate the Data Mart from flat files into the Oracle database.
- Wrote UNIX shell scripts to automate the processes. Performed a testing and QA role: developed the test plan and test scenarios, and wrote SQL*Plus test scripts for execution on converted data to ensure correct ETL data transformations and controls.
- Created complex mappings to populate the target with the required information.
- Created workflows and sessions to perform the required transformations.
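A typical use of dynamic SQL in pre-session tasks, as mentioned above, is clearing a configurable list of staging tables before each load. The sketch below illustrates the idea in Python with SQLite; the table names are hypothetical, and `DELETE` stands in for `TRUNCATE` since SQLite lacks that statement.

```python
import sqlite3

# Dynamic-SQL sketch: build and run pre-session cleanup statements from a
# configurable list of staging tables (names are illustrative only).
conn = sqlite3.connect(":memory:")
staging_tables = ["stg_orders", "stg_customers"]
for t in staging_tables:
    conn.execute(f"CREATE TABLE {t} (id INTEGER)")
    conn.execute(f"INSERT INTO {t} VALUES (1)")   # leftover rows from a prior run

def pre_session_truncate(conn, tables):
    """Generate one DELETE statement per table at run time and execute it."""
    for t in tables:
        stmt = f"DELETE FROM {t}"   # dynamic SQL assembled from metadata
        conn.execute(stmt)
    conn.commit()

pre_session_truncate(conn, staging_tables)
remaining = sum(conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
                for t in staging_tables)
```

Driving the statement text from a table list (or a metadata table) means new staging tables need only a configuration change, not new code.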
Environment: Oracle 9i/10g, Informatica 8.x, Business Objects, Unix