Data Analyst Resume

Irving, TX

SUMMARY

  • Over seven years of IT experience, including three years of professional work in Data Modeling associated with Data Warehouse/Data Mart design, system analysis, development, database design, SQL and PL/SQL programming, data analysis, and ETL development.
  • Proficient in the design and development processes required for ETL (Extraction, Transformation, and Loading) using Informatica 5.x/6.x/7.x.
  • Experienced in creating robust Mappings, Mapplets, and Reusable Transformations using Informatica Power Center and Power Mart.
  • Worked with different source and target databases and formats such as Oracle, SQL Server, Teradata, DB2, MS Access, MS Excel, flat files, and XML.
  • Experienced in creation of complex mappings using Filter, Aggregator, Expression, Router, Lookup, Update Strategy, Sequence generator, and Rank transformations.
  • Experienced in using Informatica Workflow Manager to create and schedule workflows and worklets.
  • Experience in UNIX shell scripting, scheduling cron jobs, and job scheduling on multiple platforms such as Windows NT/2000/2003 and UNIX.
  • Used Debugger to validate the mappings and gain troubleshooting information about data and error conditions.
  • Exposure to data modeling, normalization, business process analysis, reengineering, and logical and physical database design by using Erwin.
  • Worked on Business Objects XI to create Universes, classes, objects, and hierarchies for multi-dimensional analysis for a financial institution.
  • Worked with Excel, VBA, Pivot Tables and other data tools.
  • Created various types of reports and charts for a Decision Support System (DSS) using Business Objects XI. Also created complex and sophisticated reports with multiple data providers and drill-down and slice-and-dice features.
  • Created PowerPlay cubes using PowerPlay Transformer in Cognos8 to permit multi-dimensional cross sectioning and drill down analysis. Also, used multiple queries, calculated measures, customized cube content and optimized cube creation time.
  • Created several different types of ad-hoc reports and professional reports in Cognos8 and represented data graphically by charts utilizing Query Studio.
  • Created several different types of Professional Reports such as Mailing Labels, Web Page Style, Cascading, Linked / Drill Through, Production, Business, Ranking etc with complex formatting utilizing Report Studio.
  • Experienced in the complete Software Development Life Cycle (SDLC), including business requirements gathering, system analysis and design, data modeling, development, testing, migration, and production support of Data Warehouse and Business Intelligence applications. Knowledge of the full life cycle of building a Data Warehouse.
  • Excellent problem-solving skills with a strong technical background and good interpersonal skills. Quick learner and excellent team player with the ability to meet tight deadlines and work well under pressure.
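
The filter, lookup, and aggregator transformations listed above follow a common pattern that can be sketched outside Informatica. The snippet below is a minimal illustrative Python example with hypothetical sample data, not code from any of the projects below:

```python
# Illustrative filter -> lookup -> aggregate pipeline, the same pattern
# built as Informatica transformations. All data here is hypothetical.
from collections import defaultdict

orders = [
    {"cust_id": 1, "amount": 120.0, "status": "OK"},
    {"cust_id": 2, "amount": 80.0,  "status": "BAD"},
    {"cust_id": 1, "amount": 50.0,  "status": "OK"},
]
# Lookup transformation: enrich each row with the customer's region.
region_lookup = {1: "TX", 2: "IL"}

# Filter transformation: drop rejected rows.
valid = [r for r in orders if r["status"] == "OK"]

# Aggregator transformation: total amount per region.
totals = defaultdict(float)
for r in valid:
    totals[region_lookup[r["cust_id"]]] += r["amount"]

print(dict(totals))  # {'TX': 170.0}
```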

TECHNICAL SKILLS

ETL Tools: Informatica (Power Center 8.6/7.1.2/6.2), DataStage

Operating Systems: Windows 2000/NT/98/95, Sun Solaris 2.x, HP-UX, Red Hat Linux 7.2/7.3

DBMS: Oracle 7/8i/9i/10g (SQL, PL/SQL, Stored Procedures, Functions), MS Access 2002/2003, Teradata V2R4.1/V2R5, SQL Server 2000/7, DB2

Database Tools: TOAD, SQL Navigator 4.0, SQL*Loader

Data Modeling: Erwin 7.3, Power Designer

Reporting tools: Business objects, Cognos8, Cognos8 Report net, SSRS

Scripting: Unix Shell Scripting

Web Technology: HTML, CSS, Dreamweaver, JavaScript, MDX Script

Languages: C, C++, Java, C# 4.0, HTML, PL/SQL

PROFESSIONAL EXPERIENCE

Confidential, Irving, TX

Data Analyst

Responsibilities:

  • Analyzed business requirements and converted them into functional and technical requirements for a new integrated data warehouse with several data marts for the whole corporation.
  • Replicated operational tables into staging tables, then transformed and loaded data from legacy systems into the enterprise data warehouse using Informatica, loading targets through scheduled ETL workflows.
  • Developed various modules for ICD data for accurate billing analysis, including ICD-9 to ICD-10 conversion.
  • Used ETL tools such as Informatica and SSIS to standardize data from various sources and load it into the staging area of the Oracle database.
  • Involved in the schema design of several data marts and the enterprise data warehouse using Erwin. Also developed logical and physical data models that capture current-state and future-state data elements and data flows using the Erwin data modeler.
  • Worked with a variety of transformations, including Source Qualifier, Expression, Aggregator, Lookup, Filter, Sequence, Rank, Joiner, Router, Union, and XML Generator, applied as appropriate to the data transformation requirements.
  • Developed Pivot tables to analyze and summarize the data in Microsoft Excel.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Dynamic Lookup, and Router transformations to populate target tables efficiently.
  • Created robust Informatica mappings with stored procedures/Functions to build business rules to load data.
  • Worked on complex Aggregate, Join, Look up Transformations to define business rules to generate consolidated data identified by dimensions using Informatica Power Center tool.
  • Implemented Slowly Changing Dimensions, creating Type 1, Type 2, and Type 3 tables for different history-tracking requirements.
  • Called stored procedures to perform database operations in pre-session and post-session commands.
  • Created partitions to load data concurrently into targets, and reloaded bad data using the reject loader utility.
  • Regularly performed unit testing and continually tuned mappings for better performance.
  • Designed and developed UNIX shell scripts to schedule jobs automatically using crontab.
  • Involved in creation of Universes by using Business Objects to support new reporting requirements of business users.
  • Worked on creating Web Intelligence, Dashboard, Desktop Intelligence and Crystal Reports using Universes, Classes, and Objects.
  • Created complex and sophisticated reports that had multiple data providers such as Oracle, Teradata, SQL Server, Access and Sybase. Also, created reports using Business Objects functionality like combined queries, slice and dice, drill down, functions, cross tab, master detail and formulae.
  • Used SQL tools like TOAD to run SQL queries and validate the data pulled in BO reports.
  • Created PowerPlay cubes using PowerPlay Transformer in Cognos8 to permit multi-dimensional cross sectioning and drill down analysis. Also, used multiple queries, calculated measures, customized cube content and optimized cube creation time.
  • Involved in creating several different types of ad-hoc reports and professional reports in Cognos8 and represented data graphically by charts utilizing Query Studio.
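
The Type 2 slowly changing dimension work mentioned above boils down to expiring the current row and inserting a new version. A minimal SQLite sketch of that pattern, with hypothetical table and column names:

```python
# Minimal Type 2 SCD sketch: close out the current dimension row and
# insert a new version, preserving history. Names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")
con.execute("INSERT INTO dim_customer VALUES (1, 'Irving', '2010-01-01', NULL, 1)")

def scd2_update(con, cust_id, new_city, load_date):
    row = con.execute(
        "SELECT city FROM dim_customer WHERE cust_id=? AND is_current=1",
        (cust_id,)).fetchone()
    if row and row[0] != new_city:
        # Expire the current version...
        con.execute(
            "UPDATE dim_customer SET end_date=?, is_current=0 "
            "WHERE cust_id=? AND is_current=1", (load_date, cust_id))
        # ...then insert the new version as the current row.
        con.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (cust_id, new_city, load_date))

scd2_update(con, 1, "Dallas", "2012-06-15")
rows = con.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_date").fetchall()
print(rows)  # [('Irving', 0), ('Dallas', 1)]
```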

Environment: Teradata 13.0, Informatica Power Center 8.6, Business Objects 5.1, Cognos8, Oracle (8i, 9i, & 10g), PL/SQL, TOAD, SQL Server 2000, MS Excel, Pivot Tables, Windows NT/XP

Confidential, Chicago, IL

Data Analyst

Responsibilities:

  • Analyzed and identified improvements in the business requirements and converted them into functional requirements for the future system.
  • Worked to understand how the merged company's ETL processes operated and to find ways to integrate that system with our current systems.
  • Used the various ETL tools in our system to standardize data from multiple sources and load it into the staging area, which was in Oracle.
  • Developed logical and physical data models that capture current-state and future-state data elements and data flows using the Erwin data modeler.
  • Used Informatica Designer tools to design source definitions, target definitions, and transformations to build mappings.
  • Worked with MS Excel 2010 to summarize, explore, analyze, and present data using pivot tables for various mortgage origination systems.
  • Worked largely with various data sources such as flat files and XML, loading them into the Oracle DBMS using the company's ETL tools.
  • Worked on Informatica Power Center (Source Analyzer, Target Designer, Mapping and Mapplet Designer, Workflow Designer) and various types of Transformations.
  • Created and implemented various types of mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.
  • Mostly used PL/SQL Procedures/Functions to build business rules.
  • Implemented stored procedures, and created and scheduled sessions to run on demand, on time, or only once using Workflow Manager.
  • Designed and Developed complex aggregate, join, look up transformation rules (business rules) to generate Consolidated (Fact/Summary) data identified by dimensions using Informatica Power Center tool.
  • Extensively worked with Workflow Monitor to monitor workflows and sessions. Also used Workflow Monitor to identify mistakes made during transformations by analyzing session logs.
  • Extensively worked in the performance tuning of programs, ETL procedures and processes.
  • Extensively used Debugger in troubleshooting of the existing mappings.
  • Worked on creation of PowerPlay Cubes in Cognos8 and involved in creation of several different types of ad-hoc reports and professional reports by using Query Studio and report studio.
  • Thoroughly documented all mappings and workflows produced during the development process.
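
The aggregate/join rules above, which consolidate source rows into fact/summary data per dimension, can be illustrated in miniature. A SQLite sketch with a hypothetical schema (the real work was done in Informatica against Oracle):

```python
# Sketch of the join + aggregate rule that builds consolidated
# (fact/summary) rows per dimension. Schema and data are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE loans  (loan_id INTEGER, branch_id INTEGER, amount REAL);
CREATE TABLE branch (branch_id INTEGER, region TEXT);
INSERT INTO loans  VALUES (1, 10, 200.0), (2, 10, 300.0), (3, 20, 100.0);
INSERT INTO branch VALUES (10, 'Midwest'), (20, 'South');
""")

# Join the source to its dimension, then aggregate per region.
fact = con.execute("""
    SELECT b.region, COUNT(*) AS n_loans, SUM(l.amount) AS total
    FROM loans l JOIN branch b ON l.branch_id = b.branch_id
    GROUP BY b.region ORDER BY b.region
""").fetchall()
print(fact)  # [('Midwest', 2, 500.0), ('South', 1, 100.0)]
```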

Environment: Oracle 9i& 10g/11g, Erwin, SQL Server 2008, UNIX, Informatica-Power Center 9.5/8.6, Business Objects XI, Cognos8, SQL, PL/SQL, Windows NT/XP, Pivot Tables, MS Excel 2010.

Confidential, Grand Rapids, MI

Data Analyst

Responsibilities:

  • Involved in projects from the requirement-analysis phase onward to better understand the requirements and support the development team with a deeper understanding of the data.
  • Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their format.
  • Worked extensively in creating the DMR Mapping Spreadsheets, Source to Target Mapping documents.
  • Responsible for writing SQL Queries, SQL Joins, and SQL command for mapping validation.
  • Analyzed the transformation rules and Joins/Filter logic documents.
  • Worked as a member of the Data lineage Team in the Enterprise Data Program division to create the Data Functional Design Documents.
  • Reverse Engineered databases to understand the relationship between already existing tables.
  • Created an initial (conceptual) data model to incorporate the new data elements for new functionality.
  • Created Logical/Physical data models from the conceptual model identifying the cardinality between the tables.
  • Reviewed SQL queries for data dictionary and data manipulation.
  • Worked with ALM tools for defect tracking and data quality issues.
  • Prepared the Joins/Filter logic documents that would help the ETL design team perform the Joins on the tables that are in the form of flat files before loading them to FDS or any downstream systems.
  • Generated DDL scripts for the developed physical model.
  • Created Source to target Mapping Matrix for the ETL developers in loading the data to Dimensions and Fact Tables using SAS.
  • Worked with data modelers to handle the modeling changes to legacy and future databases, using Power Designer and implementing them on Development and test databases and moving the same to Production Instances during the production release.
  • Worked with Central Distribution Hub (CDH) team in developing strategies to handle data from EO to CDH and then from CDH to downstream systems.
  • Created use case scenarios to test performance of the Finance Data Store.
  • Worked with the CCG (Consumer and Credit group) team to develop a star schema for the CCG - Collections Data Mart identifying the Fact and Dimension tables.
  • Involved in semantic layer design, building views based on requirements given by users.
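
The star schema and DDL generation work above pairs a fact table with its dimensions. A minimal sketch of such DDL, verified against SQLite; all table and column names are hypothetical, loosely echoing the Collections Data Mart shape:

```python
# Sketch of DDL for a small star schema: one fact table keyed to two
# dimension tables. All names are hypothetical illustrations.
import sqlite3

ddl = """
CREATE TABLE dim_account (account_key INTEGER PRIMARY KEY, account_no TEXT);
CREATE TABLE dim_date    (date_key    INTEGER PRIMARY KEY, cal_date  TEXT);
CREATE TABLE fact_collections (
    account_key INTEGER REFERENCES dim_account(account_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount_collected REAL
);
"""
con = sqlite3.connect(":memory:")
con.executescript(ddl)
tables = sorted(r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table'"))
print(tables)  # ['dim_account', 'dim_date', 'fact_collections']
```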

Environment: Oracle 10g, Teradata, Power Designer 15.0, SSRS, SSIS, MS Excel, Windows NT/2000, ALM.

Confidential

Data Analyst

Responsibilities:

  • Designed and created various Test Plans, Test Cases based on the Business requirements.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL to query the DB2 database in a UNIX environment.
  • Worked with the design and development teams to implement the requirements.
  • Performed execution of test cases manually to verify the expected results.
  • Used TOAD for querying the database.
  • Tested mappings with the Design Documents and also performed testing for various sample data.
  • Tested mapping transformations such as Filter, Joiner, Sequence Generator, and Aggregator, and performed query overrides in Lookup transformations when required to improve mapping performance.
  • Worked extensively with Informatica tools such as Source Analyzer, Warehouse Designer, Mapplet Designer, Mapping Designer, and Workflow Manager.
  • Coordinated with team members to manage all project requirements effectively; responsible for deliverables and receivables during all project phases.
  • Participated in walkthrough and defect report meetings periodically.
  • Involved in Unit, Functional, Regression and System testing.
  • Profiled the data to understand it and to build the ETL mapping document.
  • Performed tests in the SIT, QA, and contingency/backup environments.
  • Involved in extensive data validation by writing several complex SQL queries; also involved in back-end testing and worked on data quality issues.
  • Performed periodic cross-checks against the QA, SIT, and PROD environments to ensure they were up and running.
  • Created test plans and performed unit testing for the Ab Initio graphs and its components.
  • Optimized and tuned several complex SQL queries for better performance and efficiency.
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
  • Worked on issues with migration from development to testing.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling of data.
  • Documented and reported various bugs during Manual Testing.
  • Retested resolved defects/issues after the developers fixed them.
  • Performed Installation testing and Performance testing manually.
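
The back-end data validation above typically compares source and target tables with set-difference and row-count queries. A minimal SQLite sketch with hypothetical tables and data:

```python
# Sketch of back-end ETL validation: find rows in the source missing
# from the target, plus a row-count reconciliation. Data is hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src (id INTEGER, val TEXT);
CREATE TABLE tgt (id INTEGER, val TEXT);
INSERT INTO src VALUES (1, 'a'), (2, 'b'), (3, 'c');
INSERT INTO tgt VALUES (1, 'a'), (2, 'b');
""")

# Rows present in the source but missing from the target.
missing = con.execute(
    "SELECT id, val FROM src EXCEPT SELECT id, val FROM tgt").fetchall()
print(missing)  # [(3, 'c')]

# Row-count reconciliation, a common first-pass check.
src_n = con.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_n = con.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
print(src_n - tgt_n)  # 1
```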

Environment: Informatica Power Center 6.x/7.x (Designer, Workflow Manager, Workflow Monitor, Repository), Oracle 9i, UNIX 5.0, Windows NT, Shell Scripting, SQL, PL/SQL, SQL Server 2000, Erwin, Informatica PowerConnect
