
Data Analyst/etl Developer Resume


PROFESSIONAL SUMMARY:

  • 6+ years of IT experience in Business Intelligence: data modeling, ETL design, development, programming, testing, performance tuning, implementation, troubleshooting and error handling in the field of data warehousing, along with gathering business user requirements.
  • Strong data warehousing experience specializing in RDBMS and ETL concepts; performed ETL procedures to load data from different sources into the data warehouse using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager and Workflow Monitor).
  • Expert in the ETL tool Informatica, with extensive experience working with various PowerCenter transformations using the Designer tool.
  • Worked on databases such as Oracle, DB2, Netezza, SQL Server and MS Access, and on tools such as Toad, SQL Developer and BMC Remedy.
  • Worked on industry models such as IBM BDW and FSLDM.
  • Extensive experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Normalizer, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner and Update Strategy.
  • Extensively worked with Slowly Changing Dimensions (Type I, Type II and Type III).
  • Worked with cross-functional teams such as QA, DBA and Environment teams to deploy code from development to QA and from QA to the production server.
  • Solid experience in writing SQL queries, Stored Procedures, Cursors, Indexes, Views, Sequences and Triggers.
  • Experience using the Debugger to validate mappings and gain troubleshooting information about data and error conditions.
  • Experience in Business Objects 5.x/6.x (Designer, Reporter, Supervisor, Broadcast Agent, Web Intelligence), and modeling tools such as Erwin.
  • Business Intelligence experience in Cognos, Cognos Impromptu, Cognos IWR (Impromptu Web Reports), Cognos Power Play Transformer, Cognos Upfront.
  • Excellent leadership and organization skills, with requirement-gathering experience using techniques such as JAD sessions.
  • Experience with databases such as DB2, Teradata, Netezza, Oracle 11g/10g/9i/8i and SQL Server 2000/2005 on UNIX, Windows 2000/NT/98/95 and Sun Solaris platforms.
  • Experienced in writing UNIX shell scripts, Perl scripts, and PL/SQL stored procedures, triggers and functions using Toad.
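The Slowly Changing Dimension types listed above differ only in how history is retained. As an illustrative sketch (the table and column names here are hypothetical, not from any actual project), the Type 1 and Type 2 strategies can be contrasted in Python:

```python
from datetime import date

# Hypothetical customer dimension; column names are illustrative only.
dim = [{"cust_id": 1, "city": "Austin", "eff_date": date(2020, 1, 1),
        "end_date": None, "current": True}]

def scd_type1(rows, cust_id, new_city):
    """Type 1: overwrite the attribute in place -- no history is kept."""
    for r in rows:
        if r["cust_id"] == cust_id:
            r["city"] = new_city

def scd_type2(rows, cust_id, new_city, as_of):
    """Type 2: expire the current row and insert a new versioned row,
    so the full history of the attribute survives."""
    for r in rows:
        if r["cust_id"] == cust_id and r["current"]:
            r["end_date"], r["current"] = as_of, False
    rows.append({"cust_id": cust_id, "city": new_city,
                 "eff_date": as_of, "end_date": None, "current": True})

# Type 3 (not shown) keeps only the prior value, in a "previous_city"-style
# column on the same row.
scd_type2(dim, 1, "Dallas", date(2021, 6, 1))
current = [r for r in dim if r["current"]]
print(len(dim), current[0]["city"])  # two versions; current city is Dallas
```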

TECHNICAL SKILLS:

Data Warehousing / ETL Tools: Informatica 9.6/9.5/9.1/8.6/8.1/7.6 (Repository Admin Console, Repository Manager, Designer, Workflow Manager, Workflow Monitor, Power Exchange)

Data Modeling: Erwin, MS Visio, Star Schema Modeling, Snowflake Modeling

Databases: Oracle 11g/10g/9i/8i, Universal DB2 UDB, Teradata, Sybase, MS SQL Server, MS-Access.

Languages: C, C++, JAVA, SQL, PL/SQL, T-SQL, BTEQ Scripts, Unix Shell Script

Tools: Toad, SQL*Plus, SQL Assistant 13, FastLoad, MultiLoad, FastExport, PuTTY

Operating Systems: Windows 95/98/NT/XP/2000/2003, Sun Solaris, Linux

PROFESSIONAL EXPERIENCE:

Confidential

Data Analyst/ETL developer

Responsibilities:

  • Analyzed source data coming from legacy systems, Oracle, DB2, SQL Server, PeopleSoft and flat files. Worked with the Data Warehouse team in developing the dimensional data model.
  • Coordinated with different application teams and Business Analysts to develop and standardize an enterprise-wide data model; created the dimensional data model using the Ralph Kimball methodology; designed and developed fact tables, dimension tables, and conformed fact and dimension tables.
  • Analyzed data extracted from source tables and transformed into target tables using SQL queries.
  • As an architect, responsible for designing, modeling and creating databases, normalizing or denormalizing data according to business requirements, and creating star and snowflake schemas.
  • Extensively interacted with users for requirement gathering and prototyping, and prepared various documents such as the Interface Requirement Document, Customer Requirement Document, integration test plan, unit test plan and release notes.
  • Extensively worked on analyzing data, applying business rules, designing the data warehouse, creating data marts/schemas, partitioning the data, creating a simple and fast ETL process to load the data, and automating the ETL process.
  • Participated in the complete formal design process from initial specifications and requirements. Involved in creating technical design documentation.
  • Involved in preparing Business model diagrams, flowcharts, process/work-flow diagrams, data flow and relationship diagrams using MS-Visio, ERWIN and data-models showing process mappings, screen designs, use case diagrams.
  • Reviewed the conceptual EDW (Enterprise Data Warehouse) data model with business users, App Dev and information architects to ensure all requirements were fully covered.
  • Extensively used Erwin to design logical/physical data models, perform forward/reverse engineering, publish data models to Acrobat files and support data cleansing.
  • Documented logical, physical, relational and dimensional data models. Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
  • Designed and developed Data warehouse, Data marts, star schemas, fact table, Dimension tables, Slowly Changing Dimension, Staging areas, Operational Data Store.
  • Implemented Slowly Changing Dimensions (SCD) Types 1 through 3 and completed the associated data mapping.
  • Implemented relational and dimensional models for data marts, generated DDL scripts using Erwin and PowerDesigner, and performed forward and reverse engineering.
  • Worked with the ETL team responsible for designing, modeling and creating ETL procedures to load bulk data into the highly denormalized database.
  • Designed ETL specification documents to load the data into the target using various transformations according to the business requirements.
  • Worked with the reporting team in generating various reports using BO XI and helped them provide data according to their requirements.
  • Involved in designing Universes incorporating all the business requirements and creating hierarchies to support drill down functionalities in reports.
  • Created UNIX shell scripts for data-centric applications.
  • Developed shell scripts for scheduling jobs.
  • Extensively used shell scripts for loading data and monitoring all bulk data loads.
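The star-schema work described above follows a standard pattern: a fact table keyed to conformed dimension tables, which makes rollups to any dimension attribute straightforward. A minimal sketch in Python (all table contents and names are hypothetical, for illustration only):

```python
# Hypothetical star schema: one fact table keyed to a product dimension.
dim_product = {101: {"name": "Widget", "category": "Hardware"},
               102: {"name": "Gadget", "category": "Hardware"},
               103: {"name": "License", "category": "Software"}}

fact_sales = [{"product_key": 101, "amount": 250.0},
              {"product_key": 102, "amount": 100.0},
              {"product_key": 103, "amount": 400.0}]

# Roll sales up to the category level by joining each fact row to the
# dimension through its surrogate key -- the equivalent of a SQL
# fact-to-dimension join with GROUP BY category.
totals = {}
for row in fact_sales:
    cat = dim_product[row["product_key"]]["category"]
    totals[cat] = totals.get(cat, 0.0) + row["amount"]

print(totals)  # {'Hardware': 350.0, 'Software': 400.0}
```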

Environment: Oracle, Erwin, SQL Server, Informatica PowerCenter 9.6/9.5, IDQ, IDE, Power Exchange, BO XI, OBIEE, flat files, Tivoli, Perl, Trillium, UNIX and shell scripting

Confidential

Data Analyst/ETL developer

Responsibilities:

  • Analyzed source data coming from legacy systems, Oracle, DB2, SQL Server, PeopleSoft and flat files. Worked with the Data Warehouse team in developing the dimensional data model.
  • Coordinated with different application teams to develop and standardize an enterprise-wide data model; created the dimensional data model using the Ralph Kimball methodology; designed and developed fact tables, dimension tables, and conformed fact and dimension tables.
  • Extensively interacted with users for requirement gathering and prototyping, and prepared various documents such as the Interface Requirement Document, Customer Requirement Document, integration test plan, unit test plan and release notes.
  • Extensively worked on analyzing data, applying business rules, designing the data warehouse, creating data marts/schemas, partitioning the data, creating a simple and fast ETL process to load the data, and automating the ETL process.
  • As a team lead, worked closely with all team members to assign work and track status, and helped them complete their tasks wherever required.
  • Extensively used Erwin to design logical/physical data models, perform forward/reverse engineering and publish data models to Acrobat files.
  • Designed and developed Data warehouse, Data marts, star schemas, fact table, Dimension tables, Slowly Changing Dimension, Staging areas, Operational Data Store.
  • Involved in designing, developing and testing interfaces to communicate with third-party data.
  • Managed Oracle 11g databases for daily data loads.
  • Worked with the ETL team responsible for designing, modeling and creating ETL procedures to load bulk data into the highly denormalized database.
  • Designed ETL specification documents to load the data into the target using various transformations according to the business requirements.
  • Worked with the reporting team in generating various reports using BO XI and helped them provide data according to their requirements.
  • Involved in designing Universes incorporating all the business requirements and creating hierarchies to support drill down functionalities in reports.
  • Created UNIX shell scripts for data-centric applications.
  • Developed shell scripts for scheduling jobs.
  • Extensively used shell scripts for loading data and monitoring all bulk data loads.

Environment: Oracle 11g, Erwin, SQL Server, Informatica PowerCenter (Designer 9.x, Repository Manager 9.x, Workflow Manager 9.x), Informatica Power Exchange, BO XI, BO DI, Hyperion, flat files, Tivoli, Perl, UNIX and shell scripting

Confidential

Data Analyst/ETL developer

Responsibilities:

  • Liaised with business subject matter experts in analyzing business requirements and translating them into detailed conceptual data models, process models, logical models, physical models and database schemas. Modeled and architected the DW design, and directed and/or executed the technical aspects of the overall data warehouse strategy and the ETL process in Informatica.
  • Architected the database schema and implemented the dimensional model (star schema). Reviewed and maintained the schema and its tables, indexes, views and PL/SQL procedures in Oracle 10g.
  • Mapped source system data elements to the target system, and developed, tested and supported the extract, transform and load process.
  • Developed mappings to extract data from multiple flat files and load it into staging through the CDM database.
  • Worked on Informatica Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
  • Developed complex mappings in Informatica to load data from various sources using different transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Installed IDQ on an Informatica PowerCenter server machine.
  • Used Informatica Power Exchange to handle change data capture (CDC) data from the source and load it into the data mart following the Slowly Changing Dimension (SCD) Type II process.
  • Involved in modifying existing Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica source and Target Data.
  • Analyzed source systems (SQL Server, XML and COBOL files) from a third-party vendor; these files were the source to the Informatica interfaces used to load data to the target downstream systems.
  • Created a data dictionary using the Erwin modeler.
  • Involved in Low level Design for the scripts of the database sequences, constraints, triggers and stored procedures.
  • Knowledgeable in merging several flat files into one XML file.
  • Created low-level documents for building maps to load data from the ODS through the warehouse.
  • Used the Designer debugger to test the data flow and fix mappings. Tuned Informatica mappings and sessions for optimum performance.
  • Created summarized tables, control tables, staging tables to improve the system performance and as a source for immediate recovery of Teradata database.
  • Optimized mappings through performance tuning.
  • Implemented integration of the data mart with WebLogic Server by creating connection pools and data sources for Oracle and SQL drivers.
  • Involved in Unit Testing, User Acceptance Testing (UAT) to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements.
  • Used Hyperion Explorer to generate test case scripts validating the functional data.
  • Prepared and used test data/cases to verify the accuracy and completeness of the ETL process.
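The CDC flow described above (Power Exchange feeding an SCD Type II load) boils down to deriving a delta between what the source now holds and what the target last saw. A hedged sketch in Python, using a snapshot comparison to stand in for log-based capture (keys and attribute values are hypothetical):

```python
# Hypothetical snapshots: compare source against target to derive the delta.
# In practice Power Exchange reads changes from database logs rather than
# diffing full snapshots; this only illustrates the resulting change sets.
target = {1: "Austin", 2: "Boston"}              # key -> tracked attribute
source = {1: "Austin", 2: "Denver", 3: "Miami"}

inserts = {k: v for k, v in source.items() if k not in target}
updates = {k: v for k, v in source.items()
           if k in target and target[k] != v}

# In the SCD Type II load, 'updates' expire the current dimension row and
# insert a new version, while 'inserts' create brand-new dimension rows.
print(inserts, updates)  # {3: 'Miami'} {2: 'Denver'}
```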

Environment: Informatica PowerCenter 9.5/9.1, Informatica Power Exchange, Oracle 11g, Netezza, Toad, DB2, Linux, Cognos, flat files, Hyperion.

Confidential - Chicago, IL

Informatica Developer

Responsibilities:

  • Designed ETL processes using Informatica PowerCenter 9/8.6.1 to load data from MS SQL Server, MS Access and Excel spreadsheets into the target Oracle database.
  • Extensively worked with business and data analysts to gather requirements and translate business requirements into technical specifications.
  • Prepared detailed design documentation adhering to CMMI standards.
  • Performed monthly loads, tracked issues and resolved them based on priority.
  • Extracted data from flat files, CSV files and other RDBMS databases such as DB2 into the staging area and populated it into the data warehouse.
  • Used Informatica Power Connect for SAP to pull data from SAP R/3.
  • Developed a number of complex Informatica mappings, mapplets and reusable transformations to implement the business logic and to load data incrementally.
  • Developed Informatica mappings by usage of Aggregator, SQL Overrides in Lookups, Source filter in Source Qualifier, and data flow management into multiple targets using Router transformations.
  • Worked with Informatica Power Exchange tools to give on-demand access to business users.
  • Designing and creating data quality plans as per the requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Designed and built Relational Databases.
  • Used Workflow manager for session management, database connection management and scheduling of jobs to be run in the batch process.
  • Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Tested data and data integrity among various sources and targets. Assisted the production support team with various performance-related issues.
  • Developed UNIX shell scripts to move source files to archive directory.
  • Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
  • Involved in Unit, Integration, system, and performance testing levels.
  • Created documentation for all mappings and uploaded it to the company's web portal.
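Loading "data incrementally," as mentioned above, is commonly done with a high-water-mark filter on a modification timestamp. A minimal sketch (the column name and the way the mark is persisted are assumptions; in an Informatica job the mark typically lives in a control table or a mapping parameter file):

```python
from datetime import datetime

# Hypothetical source rows carrying a last-modified timestamp.
rows = [{"id": 1, "mod_ts": datetime(2013, 5, 1)},
        {"id": 2, "mod_ts": datetime(2013, 5, 3)},
        {"id": 3, "mod_ts": datetime(2013, 5, 5)}]

# High-water mark recorded by the previous run.
last_run = datetime(2013, 5, 2)

# Only rows modified since the previous run are extracted and loaded.
delta = [r for r in rows if r["mod_ts"] > last_run]
new_mark = max(r["mod_ts"] for r in delta)   # persist this for the next run

print([r["id"] for r in delta])  # [2, 3]
```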

Environment: Informatica PowerCenter 9.1, IDQ, IDE, Power Exchange, MS SQL Server, Oracle 11g, SQL, PL/SQL, SQL*Loader, Erwin, TOAD 7.0, star schema, UNIX shell scripts, flat files, Windows Server 2003 and MS Office tools.

Confidential

ETL Developer

Responsibilities:

  • Configured Informatica Repository Manager to create user groups and user profiles.
  • Granted administrator rights and assigned security & privileges to user groups.
  • Designed and executed SQL, PL/SQL, stored procedures and triggers according to the needs of the database environment.
  • Deployed Informatica tools: PowerCenter, Workflow Manager and Workflow Monitor.
  • Utilized the Mapping Designer tools - Source Analyzer and Warehouse Designer to create and/or import the source and target database schemas.
  • Used the Mapping Designer to map the sources to the target and to create the various transformations.
  • Used Source Analyzer to work on SQL override for performance tuning.
  • Worked extensively on different types of transformations such as Expression, Filter, Aggregator, Rank, Lookup, Stored Procedure, Sequence Generator and Joiner.
  • Used Workflow Manager to create workflows and sessions to run mappings and load data into tables, warehouses, or folders in the form of reusable files.
  • Created and executed UNIX shell scripts and PL/SQL procedures, pre-session and post-session scripts to ensure timely, accurate processing and ensure balancing of job runs.
  • Used Erwin to visualize and organize data for future management and programming.
  • Supported users wif word processing files, spreadsheets, and presentation software.

Environment: Informatica Power center 8.x, Oracle 10g, Erwin, PL/SQL, SQL* Loader, SQL Server, UNIX, Windows NT, MS Excel.
