Sr. Data Analyst Resume

Hopkinton, MA

SUMMARY:

  • 10 years of IT experience in Data Analysis, Design, Development, Maintenance and Documentation for Data warehouse and related applications using ETL, BI tools, Client/Server and Web applications on UNIX and Windows platforms.
  • Comprehensive knowledge and understanding of healthcare IT, procurement and portfolio management, financial services in capital markets, and retail credit risk in fraud management platforms.
  • Extensive experience developing business requirements documents and high-level process-flow technical design documents, and liaising efficiently with SMEs to translate business requirements into technical process modelling frameworks.
  • Statistical Data Analysis using SAS/SPSS for data cleansing /profiling, Tableau Analysis and report generation
  • Experience in data mapping with a good understanding of Facets in interface analysis, mapping between Facets tables and legacy systems, and ETL tools such as Ab Initio, SSIS and Informatica Power Center.
  • Experience in visual database design tools like MySQL Workbench and user interface tools like Data studio.
  • Experience in implementing and standardizing Static, Reference and dynamic data across the enterprise data management platforms with vendor data consolidation across SFDC and Bloomberg.
  • Extensive analytics using Excel's advanced features such as macros, lookup functions, array formulas, pivot tables, conditional formatting and advanced data charting.
  • Experience working with databases such as Teradata, Oracle, MS Access and SQL Server, and coordinating web/page layouts in SharePoint.
  • Experience implementing automated test scripts with HP Quality Center and Selenium IDE.
  • Extensive experience with Extraction, Transformation and Loading (ETL) for data cleanse and standardization and Business Intelligence (BI) tools
  • Experience executing correction scripts and monitoring the performance of fixes with dashboard reporting tools such as Eclipse BIRT and Cognos BI Analysis and Report Studio.
  • Comprehensive understanding of importing and integrating data across the Hadoop Distributed File System (HDFS) using HiveQL.
  • Experience in methodologies such as Agile Scrum and Waterfall, and in creating process maps, use cases, sequence diagrams and activity diagrams.

TECHNICAL SKILLS:

ETL TOOLS: Teradata, IBM DataStage, Informatica, BIQ

DATABASES: MySQL, PostgreSQL, DB2, Microsoft SQL Server, Oracle 11g/10g/9i, Greenplum, MS Access

REPORTING TOOLS: HP Quality Center, Selenium IDE, Cognos BI, BIRT Design Studio, Tableau

PROGRAMMING: SQL, PL/SQL, C++, UNIX Shell scripting

ANALYTICAL TOOLS: SAS, Facets Business Analytics

PROFESSIONAL EXPERIENCE:

Confidential, Hopkinton, MA

Sr. Data Analyst

Responsibilities:

  • Act as a liaison between Portfolio Management and the Configuration Team, with a focus on data extraction across SAP and BIQ and on directing analysis with automated systems across BIQ.
  • Provide business, operational, financial, analytical and statistical support, primarily at the regional level within Professional Services; responsible for profiling data, cleansing redundant data, developing regional reports, preparing business plans and reporting analysis, and tracking and forecasting the regional revenue stream.
  • Gather requirements from stakeholders via discovery sessions, draft reviews of Business Requirement Documents, proof requirements documents, and obtain stakeholder approval of the documents.
  • Transform business requirement/rules documents into technical process modelling frameworks, using use cases and process flow diagrams in a high-level technical design document in conjunction with development teams and SMEs; elicit requirements, draft and edit technical and business documentation for quoting solution rules, and provide guidance and oversight for systems integration test cases, creating bug reproduction scripts in Selenium IDE.
  • Execute SQL queries to extract, manipulate and calculate costing analysis (cost reduction, margins, pricing, MSRP); a representative query sketch follows this list.
  • Generate reports using the SAS code library (REPORT & TABULATE procedures) and macros, and create new styles and table layouts using SAS ODS.
  • Carry out data mapping, table design and optimization, and convert data from spreadsheets and desktops into corporate data warehouses, including creation of data models with robust data definitions involving entity-relationship-attribute models and star/dimensional models.
  • Use Teradata SQL Assistant to manage the PeopleSoft HRMS repository (view and edit), establish data lineage among HRMS data marts, execute custom data-correction routines and transforms, import and export items between different DB2 systems, and manage metadata across the data marts to monitor data conversions.
  • Assist Quality Assurance with functional and end-to-end validation for API automation within EMC internal procurement.
  • Providing management project updates to Integrations leadership.
  • Supporting User Acceptance Testing using selenium IDE to aid automated exploratory testing
  • Ensure process improvement and compliance in the assigned module through data segmentation and predictive modelling.
  • Prepare and submit status reports for minimizing exposure and risks on the project or closure of escalations.
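
A minimal sketch of the kind of costing query referenced above; the product_price and product_cost tables and their columns are hypothetical placeholders, not the actual SAP/BIQ schema:

    -- Margin and cost summary by product line (hypothetical tables and columns)
    SELECT p.product_line,
           SUM(p.msrp)                     AS total_msrp,
           SUM(p.list_price - c.unit_cost) AS total_margin,
           AVG((p.list_price - c.unit_cost) / NULLIF(p.list_price, 0)) AS avg_margin_pct
    FROM   product_price p
    JOIN   product_cost  c ON c.product_id = p.product_id
    GROUP BY p.product_line
    ORDER BY avg_margin_pct DESC;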

Environment: SAS, PostgreSQL, Greenplum, internal BIQ, MS Access, Selenium IDE, Teradata SQL Assistant, Tableau reporting and user interface

Confidential, Plano, TX

Data Analyst

Responsibilities:

  • Performed data extraction, transformation and analysis in support of analytics projects and of requests from other stakeholders within Financial Markets or across EBM, and worked with data architects, project management, design and manufacturers/vendors to obtain workable schedules.
  • Performed solution provider/vendor data consolidation to ensure accurate product information, maintained vendor/solution provider blacklists, standardized contractual and legal agreements, and streamlined event-based process chains with financial vendor data from Markit and Bloomberg L.P.
  • Used the SaaS Cadis platform to customize the presentation of data, synchronize metadata management and generate high-quality reference data.
  • Standardized best practices across all telecom supply chain processes, and developed and monitored performance indicator reports for all primary/secondary capital markets supply chain functions using MS Excel Power Pivot and SPSS.
  • Performed spreadsheet analysis for profiling erroneous data, descriptive and inferential statistical analysis, report generation, database access, and decision support on market reference data and market price data.
  • Executed marketing campaigns and analyzed the appropriate data sources to complete pre-analysis requests in capital markets using Excel's advanced features such as macros, lookup functions, array formulas, pivot tables, conditional formatting and advanced data charting.
  • Imported and integrated data from financial data vendors' infrastructure to address ad hoc queries, viewing the data as tables and joins over the Hadoop Distributed File System (HDFS) using HiveQL (see the sketch after this list).
  • Manipulated and processed large amounts of structured and unstructured data for data conversion using Business Objects, and formatted reporting using VLOOKUPs, Power Pivot and Power Query in MS Excel 2013.
  • Created data inventories and identified opportunities for novel analytics using internal reference data, including public, government, social/digital and fundamental data.
  • Established and managed the analytics infrastructure for master reference data and transaction data, reducing redundancy of incorrect data and reconciliation expenses.
  • Extensive experience in ETL processes, extracting data from operational and legacy systems into data marts using SQL Server MDS (Configuration Manager & Deployment Manager).
  • Performed data profiling, data capture, coding management and reporting data mapping for different sources using Cognos BI Analysis and Report Studio.
  • Generated OLAP, drill-down, cross-tab and parent/child reports using BIRT Design Studio.
  • Used Agile Scrum methodology to build the different phases of Software development life cycle.
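
The HiveQL below is a representative sketch only; vendor_prices and security_master are hypothetical tables over HDFS, not the actual Markit/Bloomberg feed layout:

    -- External table over vendor price files already landed in HDFS (hypothetical layout)
    CREATE EXTERNAL TABLE IF NOT EXISTS vendor_prices (
        isin        STRING,
        price_date  STRING,
        close_price DECIMAL(18,4)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/vendor/prices';

    -- Ad hoc join of vendor prices to a security master for a given business date
    SELECT s.isin, s.issuer_name, v.price_date, v.close_price
    FROM   vendor_prices   v
    JOIN   security_master s ON s.isin = v.isin
    WHERE  v.price_date = '2014-06-30'
    ORDER BY s.issuer_name;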

Environment: Teradata SQL Assistant, SAS/SPSS, Cognos BI Analysis Studio, DB2, Business Objects Designer & Query, SQL, Power Pivot/Query, Oracle 10g, MS Visio, Hadoop, HiveQL, UNIX.

Confidential, DE

Data Quality Analyst

Responsibilities:

  • Created mapping documents, business requirement documents and technical design documents, and liaised between the analysis team and SMEs.
  • Involved in data cleansing by writing complex SQL queries and in efficient requirement capture using use cases; an illustrative profiling query follows this list.
  • Used several built-in reporting tools to generate data for analysis, including Cognos BI, Go Dashboards and SAS/SPSS Base Analysis Studio.
  • Used formatting, report procedures, graphic and text-based displays, and statistically oriented procedures in SPSS packages.
  • Improved the overall execution process and optimized Data analysis in Excel 2010 using SUMIFS, SUMPRODUCT, INDEX, MATCH, VLOOKUP Functions.
  • Worked extensively in providing clean reference data for subsequent migration of front-end activity to digital channels
  • Designed mid- and large-sized campaigns (batch & CMS) for the digital capital marketing and micro-segmentation teams with respect to pricing and reference data, and published detailed insights across MS SharePoint 2010.
  • Planned web part layouts in SharePoint and coordinated page layouts for payment management systems with HRMS payroll systems.
  • Assisted in streamlining the decision-making process by leveraging structured and unstructured consumer insights and primary capital market analysis of digital payments data to enable faster processing.
  • Channeled reference data through rapidly changing markets, products and underlying events, and established semantic differences in reference data.
  • Generated reports after every level of execution for Business Owner’s to review the data captured across digital platforms like SmartSpend and MyBarclays.
  • Worked as a validation data analyst to validate the campaigns developed by the lead segmentation analyst for PeopleSoft applications.
  • Designed and developed matrix and tabular reports with drill-down, drill-through and drop-down menu options using SSRS.
  • Used Facets Analytics for fast and easy retrieval, display and grouping of information when performing queries and generating reports.
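
An illustrative profiling query of the sort used before cleansing; customer_reference and its columns are hypothetical placeholders:

    -- Flag duplicate or incomplete reference records prior to cleansing (hypothetical table)
    SELECT cust_id,
           COUNT(*)                                       AS record_count,
           SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END) AS missing_email
    FROM   customer_reference
    GROUP BY cust_id
    HAVING COUNT(*) > 1
        OR SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END) > 0;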

Environment: Teradata SQL Assistant, flat files, SQL, MS SharePoint 2010, shell scripts, UNIX, MS Excel, SAS/SPSS

Confidential, Thousand Oaks, CA

Data Analyst

Responsibilities:

  • Reviewed business requirements and worked with SME teams to bridge gaps and design high-level functional documents describing technical details and work flow.
  • Created and executed SQL queries to perform data integrity testing on an Oracle database, validating and testing data against Facets data tables; a sample reconciliation query follows this list.
  • Identified and tracked claim processing, claims repricing and referral management with the electronic data interchange module, and determined the hierarchies in Facets dimension analysis.
  • Enabled real-time delivery of claims and customer service items, reducing bottlenecks and automating work to reduce cost.
  • Tested interfaces with Facets and the conversion of member data, provider data, provider pricing and plan codes from the legacy system to the Facets server.
  • Performed an extensive range of detailed processing with Facets claim IDs and maintenance applications that enable users to control managed health care functions.
  • Writing test scenarios based on business requirements and Business use cases
  • Documented test cases corresponding to business rules and other operating conditions within Selenium.
  • Executed sessions and batches in Informatica/mainframe and tracked the log file for failed sessions
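
A sample of the kind of reconciliation query used for data integrity testing; legacy_member_extract and facets_member_stage are placeholder names, not the actual Facets schema:

    -- Reconcile member counts between the legacy extract and the Facets-side table
    SELECT src.cnt           AS legacy_member_count,
           tgt.cnt           AS facets_member_count,
           src.cnt - tgt.cnt AS difference
    FROM (SELECT COUNT(*) AS cnt FROM legacy_member_extract) src
    CROSS JOIN (SELECT COUNT(*) AS cnt FROM facets_member_stage) tgt;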

Environment: Informatica, Facets 4.61, SQL, UNIX Shell Scripting, MS SQL Server 2005, TOAD, Medidata

Confidential, Dallas, TX

Data Analyst

Responsibilities:

  • Developed use cases to uncover behavioral and functional requirements of the data mart design, determining whether the stated requirements were clear, complete, consistent and unambiguous, and resolving any apparent conflicts.
  • Utilized shared containers for code reusability in implementing the predefined business logic and provide integrated data to the web services digital teams based on requirements
  • Created and scheduled the job sequences by checking job dependencies.
  • Wrote complex SQL queries using joins, subqueries and correlated subqueries; a correlated-subquery sketch follows this list.
  • Wrote PL/SQL stored procedures, functions, packages and triggers to implement business rules in the application.
  • Developed shell scripts to invoke back end SQL and PL/SQL programs
  • Performed unit testing to check the validity of the data at each stage.
  • Used DataStage Director to debug the jobs and to view the error log to check for errors
  • Implemented best practices in the development environment (code standards, code migration).
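
A minimal correlated-subquery sketch of the style described above; the orders table and its columns are hypothetical:

    -- Orders whose amount exceeds that customer's own average order amount
    SELECT o.order_id,
           o.customer_id,
           o.order_amount
    FROM   orders o
    WHERE  o.order_amount > (SELECT AVG(i.order_amount)
                             FROM   orders i
                             WHERE  i.customer_id = o.customer_id);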

Environment: DataStage 6.0, Cognos BI, Advanced Excel, MS Access 2000

Confidential (Key Bank), Tamil Nadu, India, 07/2008 - 02/2010

Data Analyst

Responsibilities:

  • End-to-end ETL development of the Data Mart. Data Quality Analysis to determine cleansing requirements
  • Responsible for the Dimensional Data Modeling and populating the business rules using mappings into the Repository for Meta Data management
  • Understand the business needs, develop design documents and implement the same into a functional database design
  • Extensively used ETL and Informatica to load data from MS SQL Server, Excel spreadsheet, flat files into the target Oracle database
  • Worked on Informatica Power Center tools - Source Analyzer, Data Warehousing Designer, Mapping & Mapplet Designer and Transformation Designer
  • Loaded data from the tables into the OLAP application and aggregated it further to higher levels for analysis; an aggregation sketch follows this list.
  • Developed PL/SQL procedures/packages to kick off the SQL Loader control files/procedures to load the data into Oracle
  • Tuning Informatica Mappings and Sessions for optimum performance
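
A sketch of the kind of roll-up aggregation loaded into the OLAP layer; sales_detail and sales_summary are hypothetical tables, not the actual warehouse schema:

    -- Aggregate detail rows to higher levels before loading the OLAP layer
    INSERT INTO sales_summary (region, product_line, sales_month, total_amount)
    SELECT region,
           product_line,
           sales_month,
           SUM(sale_amount)
    FROM   sales_detail
    GROUP BY ROLLUP (region, product_line, sales_month);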

Environment: Informatica Power Center 6.2, Informatica Power connect, OLAP, Oracle 8i, MS SQL Server 7.0/2000, PL/SQL, Erwin 3.5, Toad, Unix Shell Scripting.

Confidential

Data Warehousing Analyst

Responsibilities:

  • Creating and executing SQL queries to perform Data Integrity testing on an Oracle Database to validate and test data using TOAD.
  • Identified and tracked the slowly changing dimensions/mini dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems
  • Developed and executed various manual testing scenarios and exceptionally documented the process to perform functional testing of the application
  • Wrote SQL queries on data staging tables and data warehouse tables to validate the data results; a source-to-target comparison sketch follows this list.
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions
  • Compared the actual result with expected results. Validated the data by reverse engineering methodology i.e. backward navigation from target to source
  • Extensively worked on Backend using SQL Queries to validate the data in the database
  • Responsible for creating test data for testing environment
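
A minimal source-to-target comparison of the kind used to validate warehouse loads; stg_claims and dw_claims_fact are hypothetical table names:

    -- Rows present in the staging table but missing from the warehouse target
    SELECT claim_id, member_id, paid_amount
    FROM   stg_claims
    MINUS   -- use EXCEPT on DB2 / SQL Server
    SELECT claim_id, member_id, paid_amount
    FROM   dw_claims_fact;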

Environment: Data Stage, DB2, UNIX, IBM Mainframe, PL/SQL

Confidential

Data Analyst

Responsibilities:

  • Identified source systems, their connectivity, related tables and fields and ensured data consistency for mapping.
  • Analyzed, designed, developed, implemented and maintained moderate to complex initial-load and incremental-load jobs to provide data to the ODS; a minimal incremental-load sketch follows this list.
  • Created jobs and sequences to extract allowances, benefits, brokers, claims, members, providers and employee group data, applying various business rules.
  • Used DataStage Administrator to create environment variables and set project-level permissions.
  • Analyzed data issues in the load, provided solutions for the issues, discussed them with the data owners and created DataStage jobs to correct the data.
  • Developed jobs in DataStage using different stages such as Transformer, Aggregator and Lookup.
  • Designed DataStage ETL jobs for extracting data from heterogeneous source systems, transforming it and finally loading it into the data marts.
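
A minimal incremental-load sketch of the pattern described in the first bullet; src_claims, ods_claims and etl_load_control are hypothetical tables:

    -- Pull only rows changed since the last successful run into the ODS
    INSERT INTO ods_claims (claim_id, member_id, claim_status, updated_ts)
    SELECT claim_id, member_id, claim_status, updated_ts
    FROM   src_claims
    WHERE  updated_ts > (SELECT MAX(last_load_ts)
                         FROM   etl_load_control
                         WHERE  job_name = 'CLAIMS_INCR_LOAD');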

Environment: IBM Mainframes, Informatica, COGNOS, SQL Developer, HP Quality Center
