
Data Analyst Resume


CA

SUMMARY:

  • Over 7 years of IT experience in Data Analysis, System Analysis, Data Extraction, Data Validation, Data Collection, and Data Warehousing environments.
  • Experienced in implementing Data Warehousing concepts and Business Intelligence solutions using Unica, SQL Server, Oracle, MongoDB, DB2, MySQL, SAS, Teradata, Tableau, Informatica, and Business Objects.
  • Good knowledge of Relational Database Management Systems (RDBMS).
  • Extensive knowledge in MS Access Databases, MS Excel, MS Word, MS PowerPoint.
  • Experience in Oracle database administration, including database migration from Oracle 11g/10g to DB2 V10.1.
  • Work experience in Data Analytics involving Data Mining and the design and development of SAS-driven solutions in the Investment Banking domain, focusing on trend analysis and predictive modeling.
  • Hands-on experience with analytical tools such as SAS, R, RStudio, and Python.
  • Used a variety of data sources from Big Data platforms to mine and analyze massive amounts of data, identifying consumer trends and patterns, delivering real-time insights, and creating dashboards for management.
  • Experience designing and developing data pipelines to load production data into HDFS and creating analytical data marts for reporting using Pig and Hive.
  • Used Hadoop, Hive, and Pig along with SAS and R to create summarized data marts for statistical modeling and customer segmentation.
  • Expertise in SAS: PROC SUMMARY, PROC FREQ, PROC SORT, PROC SQL, and advanced DATA step operations (see the sketch after this list).
  • Analyzed Phase III clinical trials through SAS programming with statistical support from statisticians, using various SAS/STAT procedures and ODS TRACE output.
  • Worked on migrating an existing SAS web-based application from SAS 8.2 on Sun Solaris to a Linux-based SAS Grid.
  • Excellent capacity planning skills in UNIX AIX/Solaris environments.
  • Strong knowledge of performance tuning, capacity planning, and product evaluation in the UNIX area.
  • Researched and regularly monitored email/web data quality issues by applying statistical analysis with SQL and SAS STAT and with tools such as Tableau.
  • Experienced in scheduling server and parallel jobs using DataStage Director, UNIX scripts.
  • Experience in publishing Dashboards to Tableau Server.
  • Designed and deployed reports with drill-down, drill-through, and drop-down menu options, as well as parameterized and linked reports, using Tableau.
  • Experience in designing stunning visualizations using Tableau software and publishing and presenting dashboards on web and desktop platforms.
  • Good knowledge of Crystal Reports Development.
  • Proficient in the Tableau data visualization tool, analyzing and obtaining insights from large datasets and creating visually compelling, actionable interactive reports and dashboards.
  • End to end experience in designing and deploying data visualizations using Tableau.
  • Highly proficient in handling performance issues related to Tableau reports.
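
As a brief illustration of the SAS procedures listed above, the sketch below combines PROC SORT, PROC FREQ, and PROC SQL; the dataset and variable names (work.sales, region, revenue) are hypothetical placeholders rather than details from any project described here.

    /* Sort the input data by the grouping variable (placeholder names). */
    proc sort data=work.sales out=work.sales_sorted;
        by region;
    run;

    /* Frequency counts by region. */
    proc freq data=work.sales_sorted;
        tables region / nocum;
    run;

    /* Aggregate revenue per region with PROC SQL. */
    proc sql;
        create table work.region_totals as
        select region,
               sum(revenue) as total_revenue
        from work.sales_sorted
        group by region;
    quit;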

TECHNICAL SKILLS:

Databases: Oracle 11g, 10g, 9i, SQL Server 2012/2008R2, IBM DB2 V10.1, MongoDB and MySQL.

Languages: C++, Java, SQL, PL/SQL, T-SQL, SQL*Plus, HTML, XML, ASP, VB, UNIX, JavaScript, PHP, VBA.

Analytics & Reporting: SSAS, Statistica, Google Analytics, Tableau Public, SSRS, QlikView.

ETL: Informatica, DataStage, SQL Server Data Transformation Services, Visual Studio 2012/2010.

Design Tools: Erwin Data Modeler.

Big data Technologies: Apache Hadoop (Pig, Hive, HBase, HDFS).

Schedulers: WLM, Informatica Scheduler, AutoSys.

Operating Systems: Windows 8/7/XP, UNIX, Linux.

PROFESSIONAL EXPERIENCE

Confidential, CA

Data Analyst

Responsibilities:

  • Optimized complex queries for data retrieval from large databases.
  • Worked across different database environments, including SQL Server, DB2, Oracle ERP, Teradata, and SAS.
  • Developed mainframe application programs in COBOL, CICS, and Cincom's Supra relational database language.
  • Created, supported, and executed batch JCL using IBM and BMC DB2 utility tools to run the Image Copy, Reorg, Runstats, Recover, Unload, and Load utilities.
  • Maintained and scheduled the daily/weekly/monthly batch DB2 utility Job maintenance JCL (Copy, Reorg, Runstat, etc.) using IBM's DB2 Automation tool, BMC DASD Manager and CA/Platinum Database Analyzer.
  • Defined and altered DB2 database objects, executing DDL to support the client application project teams and business application developers.
  • Performed Database Administrator (DBA) support for the Cincom's Supra relational database to analyze, develop and maintain the database tables, and provided 24x7 on-call DBA support.
  • Performed data analysis using pivot tables, formulas (VLOOKUP and others), data validation, conditional formatting, and graph and chart manipulation.
  • Performed root cause analysis of data discrepancies between different business systems by examining business rules and the data model, and provided the analysis to the development/bug-fix team.
  • Performed data mining and data analysis using Hadoop Hive, SQL Server, DB2, SAS, and Excel to explore big data and identify and drive new strategies (see the query sketch after this list).
  • Calculated conditional probabilities of various problems identified at households with high volumes of customer care calls.
  • Involved in transferring files from various OLTP servers to Hadoop file system.
  • Developed Hive, Pig Latin and Impala queries to load and process data in Hadoop File system.
  • Imported and exported data between MySQL/Oracle and Hive using Sqoop.
  • Generated reports to visualize the data using the Tableau reporting tool.
  • Imported and converted Text-Based Data files into SAS Data Sets for analysis.
  • Generated reports using SAS/MACRO, SAS/ODS, PROC REPORT, PROC PRINT, PROC SUMMARY, PROC FREQ, PROC MEANS, PROC TABULATE, and PROC SQL.
  • Generated a finance report using SAS and SQL for a Cox business team to validate the performance of cohort customers.
  • Involved in the development of Korn shell scripts for executing SAS batch files on UNIX.
  • Scheduled ad-hoc procedures for failed jobs using UNIX resources and crontab.
  • Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
  • Created Prompts, customized Calculations, Conditions and Filter (Local, Global) for various analytical reports and dashboards.
  • Combined views and reports into interactive dashboards in Tableau Desktop.
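
One common pattern for the kind of data pulls described above is SQL pass-through from SAS to the relational source; a minimal sketch follows, assuming an ODBC connection whose DSN, credentials, schema, and table/column names (care.call_summary, account_id, call_date, call_count) are purely illustrative.

    /* Pass-through query: the inner SELECT runs on the remote database.  */
    /* Connection details and object names are hypothetical placeholders. */
    proc sql;
        connect to odbc (dsn="dw_dsn" user="analyst" password="XXXXXXXX");
        create table work.call_volume as
        select * from connection to odbc
            ( select account_id, call_date, call_count
              from care.call_summary
              where call_date >= '2015-01-01' );
        disconnect from odbc;
    quit;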

Environment: SAS Enterprise Guide, SAS 9.2/9.3, SAS/Access, SAS/Base, SAS/Macro, SAS/Stat, SAS/ Graph, SAS/SQL, Teradata, IBM DB2 V10.1, EXCEL, Oracle, JavaScript, VBA, SSRS, Tableau, MS Office and UNIX.

Confidential, Houston, TX

Data Analyst

Responsibilities:

  • Applied strong skills across various databases, including SQL Server, DB2, EDW, Hadoop ANA DB, and Microsoft Access.
  • Conducted sessions to resolve critical issues.
  • Performed Oracle database administration and was involved in database migration from Oracle 10g to DB2 V10.1.
  • Carried out database activities such as installation, patching, implementing backup and recovery strategies, user administration, and tablespace management on standalone databases.
  • Created the business process model using MS Visio for better understanding of the system and presented it to Project Manager and other team members for validation.
  • Performed Business Process Mapping and AS-IS/TO-BE analysis.
  • Involved in creating automated Test Scripts representing various Transactions, Documenting the Load Testing Process and Methodology.
  • Created meaningful reports for analysis and integrated the Performance Testing in the SDLC.
  • Developed and managed Project Plans and Schedules. Managed resolution of Project issues and conflicts.
  • Analyzed ETL business procedures and problems to refine data for database usage in an Oracle 11g environment, utilizing tools including TOAD, SQL, PL/SQL, SAS, Microsoft Access, and SharePoint.
  • Conducted Functional Walkthroughs, User Acceptance Testing (UAT), and supervised the development of User Manuals for customers.
  • Generated final reporting data using Tableau for testing by connecting to the corresponding Hive tables through the Hive ODBC connector.
  • Retrieved data from SQL Server using SAS EG 4.x and SAS 9.1, storing and managing the data in SAS datasets and libraries (see the sketch after this list).
  • Created SAS Stored Processes for reports/datasets created for Claims/UM/DM/Compliance Teams to fulfill ad-hoc reporting needs.
  • Automated SAS jobs via crontab using SAS 9.1 in a UNIX environment.
  • Data Visualization using TABLEAU for Reporting from Hive Tables.
  • Involved in data validation of the results in Tableau by querying the database and verifying the numbers against the data in the database tables.
  • Tested the final application for Usability testing to verify whether all the User Requirements were catered to by the application.
  • Performed requirement gathering for integration with the Enterprise Reporting System using SSRS.
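
One common way to pull SQL Server data into SAS datasets, as described above, is through a LIBNAME engine; the sketch below assumes an ODBC DSN, and the libref, schema, table, and column names are placeholders only.

    /* Assign a library pointing at SQL Server through ODBC (placeholder DSN). */
    libname claims odbc dsn="sqlsrv_dsn" schema="dbo";

    /* Copy the rows of interest into a SAS dataset in the WORK library. */
    data work.claims_extract;
        set claims.claim_detail;
        where claim_status = 'OPEN';
    run;

    /* Release the connection when finished. */
    libname claims clear;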

Environment: T-SQL, MS SQL Server Integration Services 2010/2008R2, IBM DB2 V10.1, Hadoop (Hive), MS SQL Profiler, Visual Studio 2010, SSRS, Tableau, TOAD, UNIX, Windows 7/XP.

Confidential, Houston, TX

Data Analyst/Data Modeler

Responsibilities:

  • Incorporated RUP to create Business Requirement Document Specifications using MS Visio and MS Word.
  • Worked in different types of database environments, including Oracle 10g/9i, MongoDB, SAS, SQL Server 2008R2/2005, DB2, and HDFS.
  • Conducted JAD sessions with SMEs, vendors, users, and other stakeholders to address open and pending issues.
  • Performed transition activities for non-mainframe DB2 database servers and basic Oracle database administration.
  • Responsible for meetings with users and stakeholders to identify problems, resolve issues and improve the process to ensure a stable and accurate solution.
  • Partnered with business and product teams to initiate, prioritize, scope, design, and deliver new reporting applications.
  • Provided business release support, validation planning, and data profiling for the RTM-enrolled merchant groups, work that involved big data.
  • Extracted data sets from server using PROC IMPORT and created datasets in SAS libraries.
  • Used the Import and Export facilities in SAS to exchange data between SAS and Microsoft Office environments (Excel, Access).
  • Validated data using SAS functions and procedures.
  • Developed high quality customized tables, reports and listings using PROC TABULATE, PROC SUMMARY and PROC REPORT.
  • Provided descriptive statistical analysis using procedures such as PROC FREQ, PROC MEANS, and PROC UNIVARIATE (see the sketch after this list).
  • Facilitated user acceptance testing and test strategies and trained the future users to coordinate their activities.
  • Documentation of database/data warehouse structures and updated functional specification and technical design documents.
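
A minimal sketch of the descriptive-statistics procedures mentioned above follows; the dataset and variable names (work.enrollment, merchant_group, monthly_spend) are hypothetical.

    /* Summary statistics broken out by a class variable. */
    proc means data=work.enrollment n mean std min max;
        class merchant_group;
        var monthly_spend;
    run;

    /* Detailed distribution checks, including a histogram. */
    proc univariate data=work.enrollment;
        var monthly_spend;
        histogram monthly_spend;
    run;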

Environment: SQL Server Analysis Services (SSAS), SSIS, SAS BIDS, T-SQL, PL/SQL, MDX, SAS/DI Studio, SSRS, Tableau 7, ORACLE 11g/10g, DB2, MS SQL Server 2012, UNIX.

Confidential

Data Analyst

Responsibilities:

  • Worked according to the software development life cycle (SDLC).
  • Communicated with clients about several databases and their tools, including MongoDB, SQL Server, DB2, and Oracle.
  • Worked with data movement utilities IMPORT, LOAD, and EXPORT under high volume conditions.
  • Gathered requirements from remotely based business users and defined and elaborated the requirements by holding meetings with the users (who are also Confidential's employees).
  • Worked on various complex SQL Queries, Stored Procedures, CLR Procedures, Views, Cursors and User Defined Functions to implement the business logic.
  • Analyzed the historical documentation, supporting documentation, screen prints, and e-mail conversations of the present business; wrote the business requirements document and obtained electronic sign-off from the stakeholders.
  • Utilized simple methods like PowerPoint presentations while conducting walkthroughs with the stakeholders.
  • Developed new SAS programs and Modified existing SAS programs.
  • Performed validation of SAS-generated output (tables, listings and graphs) via independent programming. Also performed QC check for output layouts.
  • Imported data into SAS datasets from flat files of various formats such as pipe-delimited, tab-delimited, .CSV, and .XPT (see the sketch after this list).
  • Conducted gap analysis to assess the variance between system capabilities and business requirements.
  • Involved in defining the source to target data mappings, Business rules, and Business data definitions.
  • Extensively involved in filtering, consolidation, cleansing, Integration, ETL and customization of data mart.
  • Communicated with the third party vendor to do the programming.
  • Designed and Optimized Connections, Data Extracts, Schedules for Background Tasks and Data Refreshes for corporate Tableau Server.
  • Designed and implemented near-real-time monitoring Dashboards for Tableau Server and Visualization users.
  • Created dashboards for ad-hoc reporting and analysis.
  • Created new reports based on requirements.
  • Used Tableau 6 as a data visualization tool to observe data patterns and identify anomalies, and helped develop data quality dashboards in Tableau.
  • Upgraded the present application by adding new functionalities.
  • Designed and developed reports such as drill-down and drill-through using Microsoft Reporting Services (SSRS).
  • Developed Reports in SSRS to send the reports directly to the respective E-mails using Subscriptions.
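
Reading delimited flat files into SAS datasets, as noted above, typically follows one of the two patterns sketched below; the file paths, dataset names, and input variables are illustrative assumptions.

    /* CSV with a header row, read via PROC IMPORT. */
    proc import datafile="/data/in/extract.csv"
                out=work.extract_csv
                dbms=csv
                replace;
        getnames=yes;
    run;

    /* Pipe-delimited file read with a DATA step and an explicit delimiter. */
    data work.extract_pipe;
        infile "/data/in/extract.txt" dlm='|' dsd firstobs=2;
        input account_id :$12. txn_date :yymmdd10. amount;
        format txn_date date9.;
    run;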

Environment: MS Office 2007, MS Visio 2003, MS Excel, SAS, PowerPoint, MS Project, UML, MS SQL Server, Visual Studio 2010, Erwin, Ad-Hoc Reports, Business Objects, Tableau 6, SSRS, MS Outlook, Windows 7, UNIX.

Confidential

SAS BI Warehouse Analyst

Responsibilities:

  • Worked with different databases within the SAS environment, including SQL Server 2005, Oracle, and MongoDB.
  • Created and maintained required Data marts and OLAP Cubes using SAS DI Studio and SAS OLAP Cube Studio to fulfill reporting requirements.
  • Performed data analysis, created reports with extensive use of BASE SAS, SAS Macro and SAS reports with the help of default procedures.
  • Performed Custom transformations on data before loading and data validation using DI Studio.
  • Created Aggregate cubes from existing OLAP cubes for reporting purposes.
  • Built and Maintained new SAS MACRO AUTOCALL LIBRARIES in addition to existing libraries.
  • Created programs to compare data structures, formats, and variables across the project using the COMPARE procedure and merge and sort techniques (see the sketch after this list).
  • Good understanding of Data Warehousing concepts.
  • As part of data quality efforts, developed reports and alerts to identify possible data quality issues and worked to resolve them.
  • Created and updated existing SAS metadata objects in the SAS metadata repository.
  • Created capacity plans and performance tuned existing ETL process.
  • Worked on edit check programs to maintain data consistency and data validation.
  • Production Job Scheduling using SAS DI Studio.
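
The dataset-comparison programs mentioned above can be sketched with PROC COMPARE, as shown below; the dataset names (work.extract_v1, work.extract_v2) and the key variable are placeholders.

    /* PROC COMPARE with an ID statement requires both inputs sorted by the key. */
    proc sort data=work.extract_v1; by account_id; run;
    proc sort data=work.extract_v2; by account_id; run;

    /* Report structural and value differences between the two versions. */
    proc compare base=work.extract_v1 compare=work.extract_v2
                 listall maxprint=50;
        id account_id;
    run;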

Environment: SAS 9.1.3, SAS/BASE, SAS/SQL, SAS DI Studio, SAS/Macros, SAS/GRAPH, SAS EG, SAS Web Report Studio, SAS/Macro, Windows 7/XP, Oracle, SQL server 2008/2005, MongoDB.
