Sr. Data Analyst Resume
Boston, MA
SUMMARY
- 7+ years of IT experience in data analysis, system analysis, data extraction, data validation, data collection, and data warehousing environments.
- Skilled in implementing data warehousing concepts and Business Intelligence solutions using Unica, SQL Server, Oracle, MongoDB, DB2, MySQL, SAS, Teradata, Tableau, Informatica, and Business Objects.
- Performed data analysis and data profiling using complex SQL on various source systems including Oracle and Teradata V2R6/R12/R13 (see the profiling sketch after this list).
- Experience in T-SQL (DDL, DML), joins, indexes, views, stored procedures, and database triggers to implement security and business logic.
- Worked with SSIS packages to extract, transform, and load data into the data warehouse from databases such as MS SQL Server, Oracle, and Sybase.
- Good experience managing Hadoop clusters using Cloudera Manager.
- Excellent understanding and knowledge of job workflow scheduling and coordination tools/services such as Oozie and ZooKeeper.
- Worked on Classic and YARN distributions of Hadoop, including Apache Hadoop 2.0.0, Cloudera CDH4, and CDH5.
- Experience in database development using SQL and PL/SQL, working on databases such as Oracle, Netezza, and SQL Server.
- Solid data warehousing experience in the design and development of Kimball data warehouses.
- Extensive experience in analysis, requirements gathering, and writing system functional specifications including use cases, with hands-on experience in Base SAS, SAS/SQL, SAS Fraud Framework, SSIS, and SPSS.
- Expertise in SAS: PROC SUMMARY, PROC FREQ, PROC SORT, PROC SQL, and advanced DATA step operations.
- Experience in SAS/BASE, SAS/MACRO, SAS/ODS, SAS/SQL, SAS/STAT, SAS/GRAPH, SAS/ACCESS, and SAS/CONNECT in UNIX and Windows environments.
- Experience in dimensional data modeling, star/snowflake schemas, and fact and dimension tables.
- Excellent knowledge of data analysis, data validation, data cleansing, data verification, identifying data mismatches, and data quality concepts.
- Strong problem analysis and resolution skills and the ability to work in multi-platform environments such as Windows and UNIX.
- Experience in UNIX shell scripting (Korn shell/Bourne shell), including automating Informatica jobs and scheduling Informatica sessions via cron.
- End-to-end experience in designing and deploying data visualizations using Tableau.
- Highly proficient in handling performance issues related to Tableau reports.
- Excellent knowledge of creating reports in SAP Business Objects, WebI, Tableau, and TIBCO Spotfire for multiple data providers.
- Excellent knowledge in preparing required project documentation and tracking and reporting regularly on the status of projects to all project stakeholders.
- In-depth knowledge of Tableau Desktop, Tableau Reader, and Tableau Server.
- Proficient in the Tableau data visualization tool to analyze and obtain insights into large datasets and to create visually compelling, actionable interactive reports and dashboards.
- Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
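A minimal SQL sketch of the data profiling described above (the stg_customer table and its columns are hypothetical, for illustration only):

    -- Profile a staging table before load: row counts, key uniqueness, null rates.
    -- stg_customer, cust_id, email, and created_dt are illustrative names.
    SELECT COUNT(*)                                        AS total_rows,
           COUNT(DISTINCT cust_id)                         AS distinct_ids,
           SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)  AS null_emails,
           MIN(created_dt)                                 AS earliest_record,
           MAX(created_dt)                                 AS latest_record
    FROM   stg_customer;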
TECHNICAL SKILLS
Databases: Oracle 11g, 10g, 9i, SQL Server 2012/2008R2, DB2, MongoDB and MySQL.
Languages: C++, Java, SQL, PL/SQL, T-SQL, SQL*Plus, HTML, XML, ASP, VB, UNIX shell, JavaScript, PHP, VBA.
Analytics & Reporting: SSAS, Statistica, Google Analytics, Tableau Public, SSRS, QlikView.
ETL: Informatica, DataStage, SQL Server Data Transformation Services (DTS), Visual Studio 2012/2010.
Design Tools: ERwin Data Modeler.
Big Data Technologies: Apache Hadoop (Pig, Hive, HBase, Sqoop, HDFS, ZooKeeper).
Schedulers: WLM, Informatica Scheduler, AutoSys.
Operating Systems: Windows 8/7/XP, UNIX, Linux.
PROFESSIONAL EXPERIENCE
Confidential, Boston, MA
Sr. Data Analyst
Responsibilities:
- Participated in requirement gathering session with business users and sponsors to understand and document the business requirements as well as the goals of the project.
- Strong experience with Oracle, Teradata, MS SQL Server 2012, DB2, and MS Access, as well as DataStage.
- Worked on BI data models and delivered BI content; analyzed BI reporting requirements and developed detailed reporting specifications.
- Gathered and translated business requirements into detailed technical specifications, new features, and enhancements to existing technical business functionality.
- Resolved recurring issues by conducting and participating in JAD sessions with users, modelers, and developers.
- Used SQL*Loader extensively to extract data from flat files and load it into the new application.
- Performed database analysis, design, coding, testing and implementation of database structures.
- Maintained UNIX scripts for monitoring and backup.
- Worked on GoldenGate, a leading data replication tool.
- Managed and monitored databases using Oracle Enterprise Manager (OEM).
- Involved in database migrations across different OS versions.
- Planned, scheduled, and implemented database changes while ensuring no impact to users.
- Applied Oracle patches for DST and performed database upgrades and migrations.
- Cloned databases using custom shell scripts with hot and cold backups of the source database.
- Developed the star schema for the proposed warehouse model to meet the requirements.
- Enforced mandatory metadata capture during the logical design phase.
- Worked with the data modeler and DBA on creating backups and restoring all DB2 databases using IBM DB2 QMF.
- Created and reviewed the conceptual model for the EDW (Enterprise Data Warehouse) with business users.
- Created Hive-based reports for data validation and pipeline testing (see the Hive sketch after this list).
- Created Hive UDFs to perform complex calculations in Hive queries.
- Deployed and monitored MapReduce jobs using Oozie.
- Designed and produced client reports using Excel, Access, Tableau, and SAS.
- Designed and built SAS programs to analyze and generate files, tables, listings, graphs and documentation.
- Executed ETL operations using SAS/Base and Enterprise Guide.
- Developed recurring SAS-generated reports and shared dashboards supporting front- and back-office operations.
- Worked extensively on building analytic applications using SAS-related products and scheduling them on Control-M.
- Prepared in-depth data analysis reports weekly, biweekly, and monthly using MS Excel, SQL, and UNIX.
- Wrote several UNIX Korn shell scripts for file transfers, error logging, data archiving, log-file checks, and cleanup processes.
- Fulfilled ad-hoc requests for operational reporting and supported IT projects by providing SME support related to reporting and analytics.
- Generated parameterized reports, subreports, and tabular reports using SSRS 2012.
- Deployed generated reports onto the Report Server for access through a browser.
- Developed parameter- and dimension-based reports, drill-down reports, cross-tab reports, charts, and tabular reports using Tableau Desktop.
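A minimal HiveQL sketch of the kind of validation report mentioned above; the landing_orders and dw_orders tables are hypothetical:

    -- Reconcile daily row counts between a landing table and the warehouse table.
    -- All table and column names here are illustrative only.
    SELECT s.load_dt,
           s.src_rows,
           t.tgt_rows,
           s.src_rows - t.tgt_rows AS row_diff
    FROM (SELECT load_dt, COUNT(*) AS src_rows FROM landing_orders GROUP BY load_dt) s
    JOIN (SELECT load_dt, COUNT(*) AS tgt_rows FROM dw_orders GROUP BY load_dt) t
      ON s.load_dt = t.load_dt
    WHERE s.src_rows <> t.tgt_rows;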
Environment: Teradata V2R6, Teradata SQL Assistant, BTEQ, SQL Server 2012/2008, SSIS (BIDS), SSRS, Visual Studio 2010, PL/SQL, SAS/BASE, SAS/MACRO, Tableau, MS Excel, Teradata 12.0, Agile, DataStage, UNIX Shell Scripting, XML, MVS, Windows 7/Vista/XP, UNIX.
Confidential, Cleveland, OH
Data Analyst
Responsibilities:
- Involved in Data mapping specifications to create and execute detailed system test plans.
- Worked with various RDBMS including Oracle, SQL Server, DB2 (with tools including QMF), and Teradata, with expertise in extracting, transforming, and loading (ETL) data from spreadsheets, database tables, and other sources using Microsoft Data Transformation Services (DTS), MDM, and Informatica.
- Played a key role in optimizing database performance by analyzing database objects, generating statistics, creating indexes and materialized views, and using Oracle 10g AWR, ADDM, ASMM, etc. (see the tuning sketch after this list).
- Checked the alert log and trace files to monitor the database on a daily basis.
- Patched Oracle Database Server 9i and 10g using OPatch and applied security patches (CPU/PSU).
- Tuned System Global Area memory structures such as the shared pool, data buffer cache, and redo buffers; addressed I/O contention; modified the parameter file; and rebuilt indexes as required for performance.
- Utilized recovery methods such as point-in-time recovery (PITR) and tablespace point-in-time recovery (TSPITR) using RMAN.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Responsible for data mapping activities from source systems to the EDW, ODS, and data marts.
- Created and maintained the Hive warehouse for Hive analysis.
- Ran various Hive queries on the data dumps and generated aggregated datasets for downstream systems for further analysis.
- Used Apache Sqoop to load user data into HDFS on a weekly basis.
- Generated test cases for new MapReduce jobs.
- Integrated Business Objects with a SAS reporting solution to provide elaborate reporting for investment management company platforms.
- Worked extensively on presenting SAS-generated data in macro-enabled Excel pivots to meet daily, weekly, and monthly reporting needs.
- Created the test environment for Staging area, loading the Staging area with data from multiple sources.
- Used SQL for querying the database in a UNIX environment.
- Involved with Design and Development team to implement the requirements.
- Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.
- Created UNIX scripts for file transfer and file manipulation.
- Developed SAS processes for business users to extract summarized data and distribute business summary reports through automated email.
- Used SQL and PROC SQL for ad-hoc report programming and creation of reports.
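A minimal Oracle sketch of the tuning work described above (mv_daily_sales, the sales table, and all columns are hypothetical names):

    -- Precompute a summary for reporting queries and add a supporting index.
    -- Table, view, and column names are illustrative only.
    CREATE MATERIALIZED VIEW mv_daily_sales
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT TRUNC(sale_dt) AS sale_day, region_id, SUM(amount) AS total_amt
    FROM sales
    GROUP BY TRUNC(sale_dt), region_id;

    CREATE INDEX ix_sales_region_dt ON sales (region_id, sale_dt);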
Environment: SQL Server Reporting Services (SSRS), SSIS, Visual Studio 2010, SAS BIDS, T-SQL, PL/SQL, MDX, SAS DI Studio, Oracle 11g/10g, MS SQL Server 2012, Windows 7/XP, UNIX, Linux.
Confidential, San Antonio, TX
Responsibilities:
- Worked with data distributed across different platforms: mainframe, Oracle, SQL Server, DataStage, and MongoDB databases on different UNIX servers.
- Created a PL/SQL module that was used to integrate existing data from third parties into the existing database.
- Developed JCL mainframe jobs and DataStage parallel jobs for the initial conversion and the daily data feed to the vendor system.
- Created ETL jobs to receive and load daily and weekly data from the vendor system into internal tables and redistribute it to regulatory bodies.
- Ran batch jobs to load database tables from flat files using SQL*Loader.
- Created file-transfer interfaces using B2B with the clearing corporation (DTCC), the vendor (Scivantage) application, and external broker-dealers to transfer cost basis when asset transfers happen between the client and other broker-dealers.
- Worked with the senior data architect to ensure that the strategy and process are compatible with the schema of the target database as well as with formats needed for loading into other CTP analytic tools, such as SAS.
- Performed overall maintenance of the Oracle 10g database, performance tuning, backups, recoveries, and other database-related activities.
- Installed and configured a 2-node RAC.
- Worked as a member of the big data team on the implementation of the Hadoop big data platform in Windows and UNIX environments.
- Working knowledge of the Hadoop ecosystem and its components for data analysis, primarily from SAS software and other tools.
- Worked closely with data architects during the migration of data from Teradata to Hadoop, identifying gaps using Hive.
- Prepared SAS programs/macros for safety tables, listings, and graphs (TLG), and occasionally for efficacy tables.
- Prepared SAS programs/macros for validating data.
- Developed UNIX shell scripts for automation, creating batch files to perform job scheduling and execute utilities.
- Evaluated data quality using developed PL/SQL stored procedures and UNIX shell scripts.
- Managed offshore resources and provided training on using the existing application, handling production issues, and creating ad-hoc reports using MS SQL Server Reporting Services.
- Created batch processes using FastLoad, BTEQ, UNIX shell, and Teradata SQL to transfer, clean up, and summarize data (see the BTEQ sketch after this list).
- Developed analytic reporting and dashboards in Tableau.
- Analyzed CPU consumed by Unica queries and optimized them in Teradata.
- Automated campaign monitoring using Tableau reports.
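A minimal BTEQ sketch of the kind of batch cleanup/summarize process mentioned above (the logon details, stg_campaign, sum_campaign, and all columns are hypothetical):

    .LOGON tdprod/batch_user,password;
    -- Purge staging rows older than 30 days, then summarize by campaign.
    -- Table and column names here are illustrative only.
    DELETE FROM stg_campaign WHERE load_dt < CURRENT_DATE - 30;
    INSERT INTO sum_campaign (camp_id, resp_cnt, load_dt)
    SELECT camp_id, COUNT(*), CURRENT_DATE
    FROM stg_campaign
    GROUP BY camp_id;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;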
Environment: SAS/Macros, SAS/SQL, SAS Enterprise Guide, SAS/ACCESS, SAS/STAT, Predictive Modeling, MS SQL Server 2008, Hadoop (Hive), Oracle, MVS, MS Visio, Windows 7/XP, Linux, UNIX.
Confidential
Crystal Report Developer
Responsibilities:
- Designed numerous Crystal Reports including cross tab reports, drill-down reports, summary reports and parameterized reports from tables, views and stored procedures according to the business requirements.
- Used SQL and T-SQL to develop views, stored procedures, and functions, which in turn were used to create reports (see the stored-procedure sketch after this list).
- Modified existing database objects to accommodate newly identified attributes as per the KPIs.
- Created Data Connections, Data Foundations, Business Elements and Business Views using Business View Manager.
- Developed various complex Crystal Reports, including drill-down reports, conditional reports, cross-tab reports, parameterized reports, summary reports, charts, and subreports, using Business Views, dynamic prompts, command objects, running totals, stored procedures, and views.
- Published Crystal Reports to Business Objects Enterprise using the Publishing Wizard.
- Created dashboards with Crystal Xcelsius using gauges, alerts, sliders, selectors, and charts to present complex data in an easy-to-visualize format.
- Resolved fan traps, chasm traps, and loops by setting cardinalities manually and using contexts and aliases.
- Created canned/ad-hoc reports using Desktop Intelligence (DeskI) and Web Intelligence (WebI) from multiple data providers through InfoView.
- Converted existing DeskI reports to WebI reports using the Report Conversion Tool.
- Developed detailed documentation for report users describing how to schedule, run, and view Crystal Reports using InfoView.
- Used SQL for querying the database in a UNIX environment.
- Involved in performance testing of all the reports, which helped in analyzing and improving report response time before moving into production.
- Created complex Business Objects WebI reports using features like sections, slice/dice, breaks, alerts, prompts, filters, and drill-down reports with multiple data providers.
- Developed detailed documentation to guide users on the new features available in Crystal Reports 10.0.
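A minimal T-SQL sketch of a report-backing stored procedure like those described above (the procedure, table, and parameter names are hypothetical):

    -- Parameterized procedure feeding a date-bounded sales report.
    -- dbo.rpt_SalesByRegion, dbo.Sales, and all columns are illustrative.
    CREATE PROCEDURE dbo.rpt_SalesByRegion
        @StartDate DATE,
        @EndDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT region_name,
               SUM(sale_amount) AS total_sales,
               COUNT(*)         AS order_count
        FROM dbo.Sales
        WHERE sale_date BETWEEN @StartDate AND @EndDate
        GROUP BY region_name
        ORDER BY total_sales DESC;
    END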
Environment: Test Director 7.x, Load Runner 7.0, SQL Server 2008, SSRS, T-SQL, Oracle 10g, Windows 7/XP, UNIX AIX 5.2, PERL, Shell Scripting.
Confidential
Business Analyst
Responsibilities:
- Performed Gap analysis to prepare ‘As-Is’ document and ‘To-Be’ document.
- Wrote vision and scope documents.
- Worked with the valuation development team to document requirements and implement system enhancements for the Asset Mark-to-Market application.
- Integrated backup status reporting for all databases in the enterprise to enable web reporting using Oracle SQL and PL/SQL.
- Used the database EXPORT/IMPORT utility to perform table-level and full-database defragmentation.
- Extensively developed UNIX shell scripts to automate database operations and database monitoring.
- Experience working with highly transactional OLTP systems.
- Conducted JAD sessions with SMEs such as front-office representatives, branch managers, and users to gather requirements.
- Authored the Business Requirements Document (BRD) with project teams.
- Authored System Requirements Specification (SRS) documents with the help of the development team.
- Performed data mining, data mapping, and data modeling for insurance.
- Worked on the specifics of cross-selling and understood the concepts of insurance products.
- Involved in the Waterfall methodology in all phases of the project.
- Participated in data analysis and design with the data analyst.
- Extensively involved in writing Test cases and Test plans.
- Performed User Acceptance Testing (UAT).
- Used the Import and Export facilities in SAS to exchange data between SAS and Microsoft Office environments (Excel, Access).
- Prepared SAS programs for generating Analysis Datasets and Summary Reports.
- Converted different formats of Data Files into SAS Datasets.
- Extensively used SAS procedures such as PRINT, REPORT, FREQ, MEANS, SUMMARY, and TRANSPOSE, and DATA _NULL_ steps, to produce ad-hoc and customized reports.
- Involved in Teradata SQL development, unit testing, and performance tuning, ensuring testing issues were resolved using defect reports (see the Teradata sketch after this list).
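A minimal Teradata sketch of the SQL tuning workflow noted above (the policy_claims table and its columns are hypothetical):

    -- Refresh optimizer statistics, then inspect the plan of a report query.
    -- Table and column names are illustrative only.
    COLLECT STATISTICS ON policy_claims COLUMN (policy_id);
    EXPLAIN
    SELECT policy_id, SUM(claim_amt) AS total_claims
    FROM policy_claims
    WHERE claim_dt >= DATE '2005-01-01'
    GROUP BY policy_id;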
Environment: SAS 9.1.3, SAS/BASE, SAS/SQL, SAS DI Studio, SAS/ACCESS, SAS/GRAPH, SAS/ODS, SAS EG, SAS Web Report Studio, SAS/Macro, Windows 7/XP, SQL Server 2005.