Senior Data Analyst Resume
Charlotte, NC
SUMMARY
- Over 7 years of experience in information technology as a Data Analyst using Teradata, SAS, Oracle, SQL, DB2, and MS Access.
- Experience in developing SAS procedures, macros, report formatting, data loading and exporting, and batch processing.
- Strong knowledge of SAS Base, SAS/STAT, SAS Macros, SAS/CONNECT, and SAS/ACCESS (including access to Oracle).
- Expertise in writing utility macros.
- Experience in loading flat files to Teradata by writing FastLoad scripts (see the FastLoad sketch after this summary).
- Experienced in automating and scheduling Teradata SQL scripts on UNIX using Korn shell scripting.
- Knowledge of different schemas (Star and Snowflake) to fit reporting, query, and business analysis requirements.
- Extensive knowledge of mortgage, banking, financial, retail, and campaign marketing data.
- Experience in logical and physical data modeling using tools such as Erwin and PowerDesigner.
- Expertise in data architecture and in normalizing data models to 1NF, 2NF, 3NF, and BCNF based on requirements and technological constraints.
- Extensively worked on database utilities such as SQL*Loader, the Teradata utilities (FastLoad, MultiLoad, FastExport, TPump, Queryman, BTEQ), and Toad.
- Strong experience with Teradata RDBMS V12: application performance tuning, Teradata SQL optimization, detailed design, application support, tuning and optimizing poorly performing queries, and database maintenance work.
- Experience in merging, concatenating, and interleaving SAS datasets.
- Developed weekly, biweekly, and monthly reports using MS Excel, MS Access, and Teradata SQL.
- Good experience in converting scripts from Oracle to Teradata.
- Extensively generated monthly standard reports using MS Excel and pivot tables to calculate statistical summaries of fields.
- Expertise in database programming: writing SQL, stored procedures, functions, triggers, and views in Teradata, Oracle, DB2, and MS Access.
- Revised existing data models to accommodate new dimensions.
- Expertise in automating batch scripts using BTEQ based on requirements.
- Experienced in troubleshooting techniques, tuning SQL statements, query optimization, and dynamic SQL.
- Successfully leveraged data analysis and complex query skills, along with Ab Initio GDE, Teradata, and UNIX shell scripting, across different projects to create tools and scripts that assure the accuracy of the data validation being performed.
- Extensive experience in data warehouse testing, database testing, ETL (Ab Initio), DB2, SQL Server 2000, Cognos BI Report Studio, Cognos BI Metric Studio, and quality control and assurance.
- Accountable for carrying out analysis and obtaining issue clarifications from both the development team and clients.
- Experience in banking and credit card industry business processes.
- Detailed project planning and control, including gathering requirements, planning, documenting, and analyzing code.
- Worked on several Waterfall and Agile methodology projects spanning different Lines of Business (LOBs).
- Experience in writing Base SAS programs to convert Teradata tables into flat files (CSV, fixed format, etc.).
- Skilled at analyzing source data for business intelligence needs, backed by strong knowledge of data warehouse testing, data migration testing, database testing, functional testing, software testing tools, and the financial markets domain.
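A minimal sketch of the FastLoad-based flat-file loading mentioned above, assuming a pipe-delimited input file; the logon, file path, table, and column names are illustrative assumptions, not taken from any actual engagement.

```sql
/* Hypothetical Teradata FastLoad script: loads a pipe-delimited flat file
   into an empty staging table. All object names and paths are illustrative. */
LOGON tdprod/load_user,password;

/* FastLoad needs the error tables from any prior run cleared first */
DROP TABLE stage_db.acct_stg_err1;
DROP TABLE stage_db.acct_stg_err2;

SET RECORD VARTEXT "|";            /* delimited input: every field is VARCHAR */

DEFINE acct_id     (VARCHAR(20)),
       acct_status (VARCHAR(10)),
       open_dt     (VARCHAR(10))
FILE = /data/inbound/accounts.txt;

BEGIN LOADING stage_db.acct_stg
      ERRORFILES stage_db.acct_stg_err1, stage_db.acct_stg_err2;

INSERT INTO stage_db.acct_stg
VALUES (:acct_id, :acct_status, :open_dt);

END LOADING;
LOGOFF;
```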
TECHNICAL SKILLS
- Data Warehouse Reporting
- Writing ETL (Extraction, Transformation, and Load) queries
- Cognos Reports (BI)
- Ab Initio GDE
- EME
- Shell scripting (Bash, Korn shell)
- Teradata SQL Assistant
- MS Access
- DB2
- SQL Developer
- MS SQL Server 2008 R2
- HP ALM
- UltraEdit
- Control-M
- Job Action and Control Scheduler (JACS)
- Unix
- Mainframes
- Windows
- Statistical packages: SAS 8.x/9.x (Base SAS, SAS/STAT, SAS/SQL, SAS/MACROS, SAS/GRAPH, SAS/ACCESS, SAS/ODS)
- SAS Enterprise Guide
- Testing Knowledge Base
- Ab Initio graph testing
- UNIX script testing
- ETL testing with UNIX scripts, without Ab Initio graphs
PROFESSIONAL EXPERIENCE
Confidential, Charlotte, NC
Senior Data Analyst
Responsibilities:
- Processed data collection to ensure proper data quality; maintained the daily error log for cleaning the data.
- Utilized ODBC connectivity to Teradata via MS Excel to retrieve data automatically from the Teradata database.
- Performed data analysis and graphical presentation for various summary reports using Base SAS and the SAS/GRAPH facility.
- Performed user acceptance testing for new modules of the UNICA system.
- The system was implemented using Informatica with the warehouse in Teradata; FastLoad, MultiLoad, BTEQ, and FastExport were the Teradata utilities used.
- Performed report testing using Crystal Reports by writing SQL queries to validate the data on the report and compare it with the DWH data.
- Created complex queries across the auto domain to build reports identifying potential customers for different solicitation projects while keeping legal requirements in mind.
- Analyzed current reporting SQL queries to update them per new legal requirements, and tested them with positive and negative scenarios.
- Analyzed data stored in the data warehouse to validate data integrity and flow into the respective entities.
- Updated mortgage interest forecasts, home price appreciation, and mortgage originations as related to credit and portfolio risk using SAS in econometric structural analysis.
- Analyzed different source systems and interfaces that interact with the Data Distribution Environment (UNIX).
- Designed universes for querying, reporting, and analysis using Business Objects.
- Developed new SAS programs and modified existing ones for the Quality Assurance System Rewrite Project in a SAS/UNIX environment.
- Converted scripts from Oracle to Teradata.
- Extracted data from the database using SAS/ACCESS and SAS SQL procedures, and created SAS data sets for statistical analysis, validation, and documentation.
- Created shell scripts to capture data from different files required by the business team for their analysis of the data.
- Created and executed data test plans and data test strategies for the projects.
- Created reusable tools and artifacts to maximize ROI through standardization of data collection processes across different domains, saving approximately 20% of query creation time.
- Led data testing teams in the Mortgage, Card, and Corporate Solutions/Finance domains.
- Identified multiple defects in reports created by development teams that had significant legal impact on one of the solicitation projects.
- Developed MS Excel input forms (lookups, VBA, macros) and reports as a new tool to support newly instituted hire/promotion tracking policies.
- Identified a defect in an existing query that would have caused the rewards summary report to show reward points multiplied by 100, a typical error when moving data from MVS files to non-DB2 tables (a validation sketch follows this list).
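The kind of reconciliation query that surfaces such a scaling defect, sketched here with hypothetical table and column names rather than the actual project objects:

```sql
/* Hypothetical reconciliation query: flags accounts whose reported reward
   points are exactly 100x the source value, the classic symptom of a missed
   implied-decimal conversion when landing MVS files into non-DB2 tables. */
SELECT  src.account_id,
        src.reward_points AS source_points,
        rpt.reward_points AS reported_points
FROM    source_rewards src
JOIN    report_rewards rpt
  ON    rpt.account_id = src.account_id
WHERE   rpt.reward_points = src.reward_points * 100;
```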
Environment: Teradata, SAS, DB2, Ab Initio, UNIX (Korn shell), MicroStrategy, Business Objects, SQL, TOAD, Hyperion Brio, VBA, HP Application Lifecycle Management, SAS Enterprise Guide, Windows XP
Confidential, San Jose, CA
Data Analyst
Responsibilities:
- Responsible for analyzing business requirements and developing reports using PowerPoint and Excel to provide data analysis solutions to business clients.
- Created tables, indexes, views, snapshots, and database links to view the information extracted from SQL files.
- Ran reporting programs, downloaded the results into Excel, and built pivot tables.
- Installed Hyperion/Brio and developed reporting queries for users.
- Worked on tables (Set, Multiset, Derived, Volatile, Global Temporary), macros, views, and procedures using SQL scripts.
- Utilized previously written campaigns in UNICA Campaign as well as custom SQL programming to generate lists of customers according to project specifications.
- Developed front-end applications to Oracle in Access as well as standalone applications with VB code.
- Managed the capture of metadata definitions and mappings.
- Worked closely with the Analytics department reporting team; deployed Tableau reports and published them on the Tableau server.
- Developed SQL scripts for data loading and table creation in Teradata.
- Designed and deployed reports with drill-down, drill-through, and drop-down menu options, as well as parameterized and linked reports, using Tableau.
- Supported Basel II preparedness and implementation to manage and assess credit portfolios.
- Extensively worked on UNIX shell scripts for automating SQL scripts, checking daily logs, sending emails, purging data, extracting data from legacy systems, archiving data in flat files to tape, setting user access levels, automating backup procedures, scheduling jobs, checking space usage, and validating data loads.
- Analyzed ETL and reporting requirements for data movement from files and operational databases to data warehouses.
- Created queries spanning multiple tables across the application domain in operational DBs and warehouses, and created reports that meet business needs.
- Analyzed requirements from a business perspective and communicated with business teams to determine which business requirements could be met within the DB and which would need more data elements for proper reporting.
- Created daily, weekly, and monthly reports using Teradata, MS Excel, and UNIX.
- Analyzed data variance across different application areas (e.g., Bankruptcy and Mortgage) to report variations in the data across domains so the data could be corrected.
- Identified data integrity issues within the data warehouse (since the table linkages were logical, not physical) to find redundant data (see the sketch after this list).
- Linked Access and Excel for analysis and data storage; used VBA macros and SQL to automate web searches for pertinent data used in analysis, storage, and reporting.
- Per ad-hoc requests, created history tables and views on top of the Finance data mart and production databases using Teradata, BTEQ, and UNIX.
- Extensive experience with SAS libraries, metadata links (registration of database tables), user management, metadata objects, stored processes, and job scheduling through Management Console and Control-M.
- Consolidated data stored in different databases (SQL Server, Oracle, DB2) into a single source of truth, also creating new interfaces for downstream applications.
- Created MS Access queries to test data loads from different source systems and files.
- Provided overall reporting on different resources during the execution phase.
- Created Ab Initio graphs for testing and reviewed them.
- Analyzed large datasets in the Teradata data warehouse.
- Interacted with the client to capture information on source data for preparing data scenarios for testing.
- Tested newly built data warehouse BI reports for samplings of accounts and time periods in the Cognos reports against the DB2 databases.
- Tested ETL code for movement of data from external vendors into databases to identify missed or incorrect data elements from source to target.
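A sketch of the orphan-row check behind the logical-linkage integrity work noted above; the fact and dimension table names are hypothetical, and Teradata volatile-table syntax is assumed since the warehouse was Teradata.

```sql
/* Hypothetical integrity check: with only logical (not physical) linkages,
   orphaned child rows must be hunted with anti-join queries like this one. */
CREATE VOLATILE TABLE orphan_txns AS (
    SELECT t.txn_id,
           t.account_id
    FROM   txn_fact t
    LEFT JOIN account_dim a
      ON   a.account_id = t.account_id
    WHERE  a.account_id IS NULL            /* no matching dimension row */
) WITH DATA
ON COMMIT PRESERVE ROWS;

SELECT COUNT(*) FROM orphan_txns;          /* non-zero = integrity issue */
```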
Environment: Teradata, SAS, SAS Enterprise Guide, Ab Initio (GDE), DB2, COBOL, UNIX (Korn/Bash shell), MS SQL Server 2005, HP Quality Center 9.2, Windows XP, VBA, Mainframe, Tableau, MVS, Cognos Report Studio
Confidential, New York, NY
Data Analyst
Responsibilities:
- Converted business requirements into functional specifications.
- Gathered and documented application remediation requirements after talking with application owners about how their applications use directory services.
- Developed front-end applications to Oracle in Access as well as standalone applications with VB code.
- Used Oracle SQL Developer tools for querying the source database.
- Reviewed the test scripts (DB2 queries used to test the reports).
- Tested the ETL code (Ab Initio).
- Performed root cause analysis of Ab Initio graph failures using UNIX.
- Generated weekly status reports for the project.
- Collaborated with business users to develop analytical approaches that met business requirements; translated business requests into database queries, analyzed data, and transformed results into actionable reports.
- Transformed data in various formats (MS Excel, CSV, text files) into SAS datasets.
- Used advanced MS Excel skills (VLOOKUP, HLOOKUP, OFFSET, SUMPRODUCT, INDEX, MATCH, pivot tables, charts) along with VBA, Access, and SQL.
- Designed and developed weekly and monthly reports related to the marketing and financial departments using SQL, MS Excel, MS Access, and Teradata.
- Created dashboards and financial benchmarking models in Excel or Access.
- Developed ad-hoc reports using Oracle, SQL, and UNIX.
- Extracted data from existing data stores and performed ad-hoc queries.
- Designed and developed the agreed-upon solution in MS Excel/MS Access with VBA programming.
- Analyzed business requirements, designed schemas at the right granularity (base cuboids), and built master data cubes for faster, more efficient reporting.
- Programmed in UNICA Affinium, extracting lists of cardmembers for various marketing campaigns and utilizing PC and MVS platforms for additional programming.
- Developed SQL and BTEQ (Teradata) queries to extract data from the production database, and built data structures and reports (see the BTEQ sketch after this list).
- Designed test scripts for validating and reviewing MicroStrategy reports for data quality and data consistency at the end of each sprint.
- Moved SAS jobs from the mainframe to a UNIX server, and modified and enhanced the code to execute on the UNIX platform.
- Wrote Access, SQL, and PL/SQL queries to work with Oracle databases and financial applications.
- Worked on database design and reporting; developed dashboards in Tableau and published the reports; worked on data warehousing and reporting.
- Automated Access-to-Excel and Excel-to-Access transfers, utilizing ADODB connectivity and Excel and Access objects.
- Utilized VBA in MS Access, along with lookups, compound IF statements, and macros in MS Excel, for analysis and reporting; used SQL for automation.
- Designed user-friendly, personalized, and intuitive complex ad-hoc reports for advanced business users.
- Designed a personalized, interactive, and intuitive dashboard for analyzing operational costs of TESCO logistics.
- Extracted the latest account data from Oracle and Teradata databases using SQL and BTEQ.
- Involved in various phases of Software Development Life Cycle (SDLC) of the application like requirement gathering, design, analysis, development, unit testing and deployment.
- Designed ETL batch modules using FastLoad, MultiLoad, FastExport, and BTEQ that are invoked through JCL.
- Created ad-hoc and scheduled reports for different business groups using Tableau.
- Reviewed the existing conceptual data model, designed logical-model data marts, and implemented the activity model and physical model.
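A minimal BTEQ export sketch of the production extracts described above; the logon, database, table, and column names are assumptions for illustration only.

```sql
/* Hypothetical BTEQ script: exports the latest row per account from a
   production table to a pipe-delimited flat file for downstream reporting. */
.LOGON tdprod/analyst_user,password;

.EXPORT REPORT FILE = /data/extracts/latest_accounts.txt;

SELECT  CAST(acct_id AS VARCHAR(20)) || '|' ||
        acct_status                  || '|' ||
        CAST(CAST(open_dt AS FORMAT 'YYYY-MM-DD') AS CHAR(10))
FROM    prod_db.accounts
QUALIFY ROW_NUMBER() OVER (PARTITION BY acct_id
                           ORDER BY load_ts DESC) = 1;   /* latest row only */

.EXPORT RESET;
.LOGOFF;
.QUIT;
```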
Environment: SAS, SAS Enterprise Guide, Management Console, MS Excel, Access, PowerPoint, Netezza, Teradata Administrator, Tableau, Teradata SQL Assistant, Teradata Manager, DB2, SQL, UNIX, Visio, VBA, UNIX shell scripting, Business Objects, Teradata, FastLoad, MultiLoad, BTEQ, Informatica, IDQ
Confidential, Richmond, VA
Data Analyst
Responsibilities:
- Designed & developed various departmental reports by using SAS, SQL, and Ms Excel.
- Performed data analysis and Data mapping from source(s) system to target system(s).
- Performed detail analysis on Business requirements, Data requirements, System requirements and conduct gap analysis to meet the business needs.
- Creating pivots using VBA.
- Created Unix Scripts that uses Bteq to access Teradata Database.
- Converted Scripts from Oracle to Teradata.
- Reported quarterly campaign results for Reverse Mortgage campaigns to the business.
- Extracted raw data from excel spreadsheets and MS Access. Created input data sets and analyzed the data by creating graphs using the SAS/GRAPH facility.
- Responsible for Analyzing report requirements and developing the reports by writing Teradata SQL Queries and using MS Excel, Power Point and UNIX.
- Created pivot tables in Excel by getting data from Teradata and Oracle.
- Responsible for providing statistical research analyses and reports in support for mortgage product.
- Sets up QA/QC framework and ensures adherence and constant improvement.
- Developed the business and technical knowledge of team members and identified opportunities for improvement.
- Independently interacted with business partners to understand business needs and translate them into technical specifications; assisted in developing and testing new information infrastructure.
- Documented the process, i.e., captured all available information about the application, such as SAS programs, data files, and sources.
- Exported data to Excel from Hyperion.
- Performed data modeling and financial analysis.
- Developed business cases to justify project spending, developed business requirements, and completed UAT in an Agile/Scrum environment.
- Defined data process models using activity diagrams according to UML methodology.
- Worked with business and technical resources to understand business requirements (BRs) and identify Business Intelligence needs; worked with numerous business units to gather, vet, and document BRs; ensured BRs were met in the various Program Alliance work streams and provided guidance toward achieving the program's goals efficiently.
- Analyzed and segmented BRs into high-level and low-level use cases.
- Coordinated the data mapping and migration of existing system’s data to the Data Warehouse.
- Designed and developed Data Models and Marts for the Business Intelligence Data Warehouse.
- Developed common sourcing requirements for BI EDW database. Worked with Sourcing Subject Matter Experts (SMEs) to document sourcing attribute requirements into structured documents.
- Uploaded the results in Quality Center.
- Participated in periodic walkthrough and defect report meetings.
- Documented and reported various bugs during the testing process.
- Wrote SQL queries to define, identify, and validate the code written for data movement into the database tables, and fine-tuned the queries for better performance.
- Retested resolved defects/issues after the developers fixed them.
- Prepared test data and test scripts for UAT and participated in UAT signoff.
- Regularly updated the QA manager on testing status and reported accomplished tasks for the assigned work to the Project Management team.
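A sketch of the quarterly roll-up behind the Reverse Mortgage campaign reporting mentioned above; the table, column, and product-code names are hypothetical, and results like these would then be pivoted in Excel.

```sql
/* Hypothetical Teradata query: summarizes campaign contacts, responses,
   and response rate by quarter. All object names are illustrative. */
SELECT  campaign_cd,
        EXTRACT(YEAR FROM response_dt)             AS campaign_yr,
        (EXTRACT(MONTH FROM response_dt) + 2) / 3  AS campaign_qtr,
        COUNT(*)                                   AS contacts,
        SUM(responded_flag)                        AS responses,
        CAST(SUM(responded_flag) AS DECIMAL(9,4))
            / NULLIFZERO(COUNT(*))                 AS response_rate
FROM    campaign_results
WHERE   product_cd = 'REV_MTG'
GROUP BY 1, 2, 3
ORDER BY 2, 3, 1;
```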
Environment: Teradata V12, SAS, Mainframe, Endevor, CA-7, JCL, Rational ClearCase, Rational ClearQuest
Confidential, Chicago, IL
SQL Developer
Responsibilities:
- Involved in performing extraction, transformation and loading using DTS.
- Handled database objects with Enterprise Manager and Query Analyzer.
- Involved in data integration by identifying the information needs within and across functional areas for an enterprise database upgrade, and performed scripting/data migration with the SQL Server 2000 Export Utility.
- Involved in developing logical and physical databases using Erwin, normalization, dimensional modeling, and Enterprise Manager.
- Optimized schema, performance, and capacity planning of various data transformation processes related to the reports.
- Generated test data and tested the database to meet the functionalities and deliverables in the project documentation and specifications.
- Gained experience with Oracle Supply Chain.
- Involved in designing and implementing stored procedures and triggers for automating tasks (see the sketch after this list).
- Debugged PL/SQL packages to troubleshoot issues.
- Worked on EJB components related to J2EE.
- Wrote back-end PL/SQL code to implement business rules through triggers, cursors, procedures, functions, and packages using the SQL*Plus editor or TOAD.
- Maintained all records of project status and tracked changes per control procedures.
- Provided technical support to internal developers and external clients.
- Transferred data between development, testing, and production databases.
- Actively tuned the database design to speed up certain daily jobs and stored procedures.
- Provided support to development staff on database-related issues.
- Used DDL and DML for writing triggers, stored procedures, and data manipulation.
- Scheduled database backups and performed full and differential backups of databases and transaction logs.
- Altered tables and fields per requirements and recreated indexes.
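A small PL/SQL sketch of the task-automating triggers described above; the table and column names are hypothetical, not from the actual system.

```sql
-- Hypothetical audit trigger: automatically records who changed an order
-- row and when, the kind of bookkeeping task automated with triggers.
CREATE OR REPLACE TRIGGER trg_orders_audit
AFTER INSERT OR UPDATE ON orders
FOR EACH ROW
BEGIN
    INSERT INTO orders_audit (order_id, changed_by, changed_at, action)
    VALUES (:NEW.order_id,
            USER,
            SYSDATE,
            CASE WHEN INSERTING THEN 'INSERT' ELSE 'UPDATE' END);
END;
/
```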
Environment: MS SQL Server 2005/2000/7.0, Query Analyzer, EJB, Windows Server, PL/SQL, J2EE, Windows NT/2003/2000, T-SQL, Erwin, SQL*Plus, TOAD