Data Analyst Resume Profile
Dayton, OH
Summary
- Over 8 years of extensive experience working at multiple levels of Metadata Management projects, including designing the metadata strategy, defining the metadata management architecture, managing the metadata project, mentoring metadata developers and conducting training
- Experience in Teradata development and design of ETL methodology supporting data transformation processing in a corporate-wide ETL solution using Teradata TD12.0/TD13.0, Mainframe and Informatica PowerCenter 8.6.0/9.0.1; also experienced in administration, analyzing business needs of clients, developing effective and efficient solutions, and ensuring client deliverables within committed deadlines
- Administered large Teradata database systems in development, staging and production.
- Involved in various stages of Software development life cycle.
- Strong hands-on experience using the Teradata utilities FastExport, MultiLoad, FastLoad, TPump, BTEQ and Queryman.
- Skilled in Data Warehousing Logical and Physical Data Modeling using Star Schema and Snowflake Schema.
- Experience in Banking and Credit Card industry business processes.
- Technical and functional experience in data warehouse ETL implementations using Informatica PowerCenter 9.0.1/8.6/8.1/7.1 and Teradata
- Good knowledge of Teradata RDBMS architecture, tools and utilities
- Sourced data from disparate sources like Mainframe Z/OS, UNIX flat files, IBM DB2 and loaded into Teradata DW.
- Strong Experience in Designing end-to-end ETL framework and strategies to handle Re-startability, Error handling, Data reconciliation, Batch processing and process automation
- Strong experience in Data analysis, Business rules development, data mapping and translating business requirements into technical design specifications
- Good understanding of Data Modeling (Star and Snowflake schemas), physical and logical models, and DWH concepts
- Good experience setting up real-time Change Data Capture processes using the IBM Change Data Capture (CDC) tool, sourcing a DB2 database and targeting a Netezza database
- Good experience in Data profiling and Data analysis using IBM InfoSphere Information Analyzer
- Good understanding of SCRUM methodology and Sprint planning to achieve the project goals and milestones effectively and efficiently
- Experienced in developing data warehouses/data marts, Star Schema, Snowflake Schema and Slowly Changing Dimensions.
- Very good knowledge of budgeting, forecasting and financial planning; created Balance Sheet, Income Statement and Cash Flow Statement reports with multiple-currency support using Hyperion Planning
- Extracted source data from Mainframe z/OS using JCL scripts and SQL into the UNIX environment, and created formatted reports for business users using BTEQ scripts
- Strong Teradata SQL, ANSI SQL coding skills.
- Expertise in Report formatting, Batch processing, Data Loading and Export using BTEQ.
- Performed tuning of user queries by analyzing explain plans, recreating user driver tables with the right Primary Index, scheduling collection of statistics, and adding secondary or join indexes
- Well versed with Teradata Analyst Pack including Statistics Wizard, Index Wizard and Visual Explain.
- Extensively used Derived tables, Volatile tables and Global Temporary (GTT) tables in many of the BTEQ scripts.
- Involved in designing and building stored procedures, view generators and macros for the module.
- Worked on moving data to a dedicated environment for Analytics for better performance; involved in Analysis, Administration, Security, Development and Support.
- Expertise in Procedures, Functions, Database Packages and Triggers; experienced in troubleshooting techniques, tuning SQL statements, query optimization and dynamic SQL.
- Good knowledge of Performance tuning, Application Development and Application Support on UNIX, MVS and WINDOWS NT Environments
- Developed UNIX Shell scripts for Batch processing.
- Responsible for writing Deployment/Release Notes before the project release and End User's Manual for production team.
- Trained in Hadoop and worked on a proof of concept to convert an application from Mainframe/Teradata to Hadoop.
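As a concrete illustration of the BTEQ report formatting and batch processing mentioned above, a minimal wrapper script might look like the following sketch. All names (logon, database, table, file paths) are hypothetical, and the bteq invocation is guarded so the script degrades gracefully where the Teradata client is not installed.

```shell
#!/bin/sh
# Sketch of a BTEQ report wrapper of the kind described above.
# Logon, database and table names are illustrative, not from this resume.

REPORT_SQL=/tmp/daily_report.btq

# Generate the BTEQ script: .EXPORT REPORT formats the result set
# as a flat report file suitable for business users.
cat > "$REPORT_SQL" <<'EOF'
.LOGON tdprod/etl_user,secret;
.EXPORT REPORT FILE=/tmp/daily_report.txt;
.SET WIDTH 120;
SELECT acct_id, SUM(txn_amt) AS total_amt
FROM   fin_mart.daily_txn
GROUP  BY acct_id
ORDER  BY total_amt DESC;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF

# Run BTEQ only when the client exists; check the return code so a
# batch scheduler (e.g. Autosys/CA7) can react to failures.
if command -v bteq >/dev/null 2>&1; then
    bteq < "$REPORT_SQL" || { echo "BTEQ failed: rc=$?"; exit 1; }
else
    echo "bteq not installed; generated script at $REPORT_SQL"
fi
```

In a scheduled batch, the wrapper's non-zero exit status is what lets the scheduler halt downstream jobs when the report step fails.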
TECHNICAL SKILLS
RDBMS : Teradata, MS Access, DB2
Query Tools : Teradata SQL Assistant, SQL Plus
ETL Tools : Informatica Power Center, Mainframe JCL
Scheduling Tools : Autosys, CA7
Scripting : UNIX Shell Scripting
Languages : SQL, JCL, COBOL, XML
BI ETL Tools : Informatica 8.6 PowerCenter, PowerExchange, Metadata Manager;
IBM InfoSphere 8.7 DataStage, Business Glossary, Metadata Workbench
Reporting Tools : Hyperion Web Analysis 9.3.1, Brio 6.x, Interactive Reporting, Smart View,
Hyperion SQR, Financial Reporting, Hyperion spreadsheet Add In, Cognos,
Data Reports, Business Objects 5.0, Micro Strategy 8.2, MS Access Reports
Data Modeling : Dimensional Data Modeling, Data Modeling, Star Schema Modeling,
Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical
Data Modeling, Erwin 3.5.2/3.x/6.0, TOAD
GUI : MS Office Suite, Visual Basic
Documentation : MS-Office 2003/07/10, MS-Visio, IBM Lotus Notes
Operating Systems : Windows XP, Windows NT/98/2000, UNIX
Experience:
Confidential
Data Analyst
- Responsible for creating the ETL design specification document to load data from the operational data store to the data warehouse.
- Developed FastLoad and MultiLoad scripts to load data from legacy systems and flat files through Informatica into the Teradata data warehouse.
- Worked with business analysts to provide business performance data using Teradata, Oracle, SQL, BTEQ and UNIX.
- Built VBA macros for recurring jobs that pull data for specific requirements.
- Created daily, weekly and monthly reports using Teradata, MS Excel and UNIX.
- Wrote BTEQ and SQL scripts for large data pulls and ad hoc reports for analysis.
- Created tables, History tables and views on top of the Finance Data Mart / production databases per ad hoc requests using Teradata, BTEQ and UNIX.
- Created tables, indexes, views, snapshots and database links to view the information extracted from SQL files.
- Created different types of tables (Set, Multiset, Derived, Volatile, Global Temporary) as well as macros, views and procedures using SQL scripts.
- Tested Primary Indexes and skew ratio before populating tables, using sampling techniques and Explain plans in Teradata before querying large tables with billions of records and several joins.
- Extensively worked on UNIX shell scripts for automating SQL scripts, checking daily logs, sending e-mails, purging data, extracting data from legacy systems, archiving flat-file data to tapes, setting user access levels, automating backup procedures, scheduling jobs, checking space usage and validating data loads.
- Used Teradata utilities such as MultiLoad, TPump and FastLoad to load data into the Teradata data warehouse from multiple sources.
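The shell-script housekeeping described in the bullets above (log checking, purging, load validation) can be sketched roughly as below. The directory, retention window and Teradata utility error pattern (UTYnnnn) are illustrative assumptions, and sample log lines stand in for real MultiLoad/FastLoad output.

```shell
#!/bin/sh
# Illustrative housekeeping sketch: scan load logs for errors and
# list logs past a retention window for purging. Paths are hypothetical.

LOG_DIR=/tmp/etl_logs
RETENTION_DAYS=30

mkdir -p "$LOG_DIR"
# Sample entries standing in for real Teradata utility output.
printf '%s\n' 'RC=0 load ok' 'RC=12 UTY0805 RDBMS failure' > "$LOG_DIR/load_20240101.log"

# Flag any log containing a Teradata utility error code (UTY + 4 digits).
if grep -l 'UTY[0-9]\{4\}' "$LOG_DIR"/*.log >/dev/null 2>&1; then
    echo "errors found in load logs"   # a real script would e-mail the DBA here
fi

# List logs past retention; the archive-to-tape step is omitted in this sketch.
find "$LOG_DIR" -name '*.log' -mtime +"$RETENTION_DAYS" -print
```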
Environment: MS Excel, MS Word, Teradata 13.0, Oracle 9i, Teradata SQL Assistant, VB, Tableau, BTEQ, Erwin, PC SAS, MVS, UNIX Shell scripts, DB2, SQL, PowerPoint, SAS Enterprise Guide, BOBJ, Putty
Confidential
Data Analyst
- Performed impact analysis for new/changed requirements and prepared the LLD (Low Level Design).
- Compared client-supplied artifacts such as the BRD and HLD with the LLD to identify any incompleteness.
- Prepared the UTP and UTS.
- The project's objective was to provide development, enhancements and maintenance for the Data Warehouse.
- Converted the Excel application for the above-mentioned advertising campaign to HTML, utilizing SAS ODS and PROC TEMPLATE.
- The warehouse is a Mainframe / Informatica-UNIX and Teradata based database structure designed to handle Bank of America's Customer Knowledge Decision Support (CK DS) activities.
- The main objective of the warehouse is to help the Bank make decisions and shape strategies by providing historical data from multiple streams, supporting planning and decision making.
- Developed the code according to the LLD using tools such as the SAS Raw Data Load (RDL) tool.
- Standardized the code using tools such as ALIGN JCL, ASA and QRC.
- Performed peer reviews and code walkthroughs.
- Integrated the business, technical and operational metadata captured from various tools and created data lineage showing end-to-end data flow.
- Designed and implemented the metadata change management and deployment process for moving metadata from one environment to another.
- Wrote UNIX shell scripts and designed batch processes to automate metadata import/export, data lineage runs and the metadata deployment process.
- Assisted in setting up the SAS execution platform on the UNIX system.
- Performed SIT (System Integration Testing) and provided data to the onshore counterpart for submission to the user community.
- Assisted in deployment and provided technical and operational support during the install.
- Involved in post-implementation support.
- Brief listing of tasks done on a weekly basis:
- Ensured overall data quality of all the deliverables.
- Used IPMS for all project-management-related quality activities.
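The metadata import/export batch automation described above can be sketched as a run-step-then-check pattern, shown below with the actual metadata tool invocation (e.g. IBM InfoSphere's istool) stubbed out by placeholders, since the real CLI arguments and credentials are environment-specific assumptions.

```shell
#!/bin/sh
# Batch pattern only: each step runs, and the batch stops on the first
# failure so a scheduler can alert. The export/import commands are
# placeholders for the real metadata tool CLI (hypothetical names/paths).

ARCHIVE=/tmp/metadata_export.isx

run_step () {
    step_name=$1; shift
    echo ">> $step_name"
    "$@" || { echo "step '$step_name' failed (rc=$?)"; exit 1; }
}

run_step "export metadata from DEV" sh -c "echo demo-archive > $ARCHIVE"
run_step "verify export archive"    test -s "$ARCHIVE"
run_step "import metadata into QA"  echo "(real import command would run here)"
```

Wrapping each step this way gives the deployment process a single, consistent failure path, which is what makes it safe to hand over to an unattended scheduler.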
Environment: Teradata SQL Assistant, VB, BTEQ, Oracle 9i, UNIX Shell scripts, SQL, PowerPoint, Excel, IBM z/OS, Informatica PowerCenter, MVS, JCL, Ab Initio
Confidential
Data Analyst
- Analysis of requirements for finding any ambiguity, incompleteness or incorrectness for enhancements and other projects assigned
- Modified VB SQL code to execute in the SAS environment, modified system interface code, and wrote report programs to replicate report layouts from VB in SAS.
- Attended technical meetings for finalization of requirements and design.
- Involved in Development and enhancement of the Associate Data Load process set up.
- Standardized the code using the ASA (Automatic Standard Analyzer) tool.
- Cleaned the database by eliminating duplicate records using the DATACLEAN tool before preparing the test environment.
- Performed reporting and data analysis using Access, Excel and Crystal Reports.
- Wrote programs and SAS macros to assist other team members in resolving issues during conversion, and explained their use and functionality to the team.
- Determined the testing coverage of the system after each enhancement using the TCA (Test Coverage Analyzer) tool, providing the percentage of code tested as well as the flows yet to be tested.
- Mainframe-based Teradata database designed to handle customer knowledge and decision support activities.
- Data Warehouse/OLAP/Reporting/Dashboard development
- Started the company's movement into OLAP and data warehousing.
- This project involved data extraction from source files for different systems such as fleet (ADIM), CDS, CSDB, CAP, IMPACS and MSP, and loading the data into the warehouse.
- The process achieves the main objective of the warehouse: helping the bank in its decision- and strategy-making process.
- Converted SQL queries and report code from Actuate VB running on NT to SAS running on UNIX.
- Used the SAS tool specially created for the DW resource pool to generate the Create and Insert scripts for ADIM.
- Data is loaded using ETL and is further used to develop views and extracts and to generate reports.
- Coordinated with offshore teams on the coding work based on the approved design.
- Carried out Unit, System Integration, Regression testing for appropriate initiatives based on the test plans/scripts.
- Installation of components in production based on appropriate timelines.
- Production support activities involving production job monitoring, problem fixing and persistent issue resolution; served as the primary on-call for ADIM.
- Worked on technical problem tickets from the ADIM user group, involving problem tracking, analysis and follow-up with the source system based on ticket type.
Environment: Teradata SQL Assistant, BTEQ, Oracle 9i, Teradata, Windows NT/XP, UNIX Shell scripts, SQL, PowerPoint, Excel, IBM Mainframe, Tableau, MVS/ESA, JCL
Confidential
SAS Consultant
- Responsible for correcting and maintaining daily, weekly, and monthly insurance claims and payments reports using SAS.
- Created ad hoc reports for various data analysis projects.
- Programmed in Unica Affinium, extracting lists of card members for various marketing campaigns and utilizing PC and MVS platforms for additional programming; utilized Teradata, DB2 and SyncSort for data collection.
- Developed rules files to upload data/metadata from the financial system's database to the Essbase cube.
- Migrated metadata and security reports using Lifecycle Management (LCM) from Dev to QA to Production.
- Provided best-practice recommendations and guidelines in the form of standards and checklists to all development groups to capture the required metadata seamlessly.
- Imported technical metadata from various source/target systems, including databases (Netezza, DB2, Oracle, SQL Server), files (VSAM, Sequential), data models (CA ERwin), source-to-target mapping spreadsheets (MS Excel), ETL (DataStage), data profile results (Information Analyzer) and reports (Cognos), into the metadata repository.
- Developed data mart for sales and financial data/SSAS OLAP cubes/SSRS reports
- Assisted in training the off-shore SAS and IT teams in understanding company data and systems.
- Responsible for creating and delivering recurring as well as ad hoc marketing campaign programming within strict timelines under constantly changing requirements, using SAS, SAS SQL and Teradata SQL in a UNIX environment with Oracle, Informix and Teradata relational databases.
- Responsible for actively supporting the development of systems that provide monthly activity-based costing reports and profitability information to departments throughout the company, using SAS, JCL, PRF query creation, Teradata SQL and BTEQ in a mainframe environment along with Microsoft Excel.
- Responsible for Risk Management processing and criteria development of statement mailings each month to all current U.S. Associates consumer customers, including qualifying and auditing over 200 million in pre-approved offers as well as selecting customers for various marketing initiatives, using SAS in a UNIX environment.
- Used SAS to develop various ad hoc programs and reports, and provided detailed monthly tracking analysis, executive summaries and presentations to senior management on the performance and execution of marketing programs and trend analysis for planning and control issues.
Environment: Teradata, SAS v8/v9, SQL Assistant 12.0, Erwin, UNIX Sun Solaris, ClearCase, Hyperion Brio, Oracle, Tableau, QlikView, Rational Rose, Requisite-Pro, Agile, Documentum, MS Project 2002, MS Visio, MS Word, MS Excel, MS Access, Test Director, Java, Cognos, MySQL, Windows NT/2000, Proc SQL, PC SAS, OLAP, PL/SQL, DB2, SQL