
Sr. Data Analyst Resume

Newark, NJ

SUMMARY

  • Over 8 years of experience as a Data Analyst in the finance industry, with ETL development and data modeling.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Integration and Metadata Management Services.
  • Good experience in Data Modeling with expertise in creating Star and Snowflake schemas, FACT and Dimension tables, and physical and logical data modeling using Erwin.
  • Ability to collaborate with peers in both business and technical areas to deliver optimal business process solutions in line with corporate priorities.
  • Software Development Life Cycle (SDLC) experience, including requirements gathering, specification analysis/design, and testing.
  • Experienced in developing accurate business process requirements for banking, insurance, and state government projects.
  • Experience in understanding Stored Procedures, Stored Functions, Database Triggers, and Packages using PL/SQL.
  • Good understanding of data warehousing concepts, including metadata and data marts.
  • Experience in conducting Joint Application Development (JAD) sessions with end users, Subject Matter Experts (SMEs), Development, and QA teams.
  • Extensive experience in project management best practices, processes, and methodologies, including Rational Unified Process (RUP) and SDLC.
  • Collected, cleansed, and provided data mining, modeling, and analysis of structured and unstructured data used for major business initiatives.
  • Excellent communicator focused on customer service.
  • Ability to understand current business processes and implement efficient business process.
  • Expertise in defining scope of projects based on gathered Business Requirements including documentation of constraints, assumptions, business impacts & project risks.
  • Strong background in support documentation.
  • Analysis and review of Software and Business Requirement Documents.
  • Conducting requirement gathering sessions, feasibility studies and organizing the software requirements in a structured way.
  • Gathering business and Technical requirements that would best suit the needs of the technical architectural development process.
  • Team lead skills encompassing user interviews and coordination with Tech Leads, DBAs, Developers, and QA/QC Analysts during the design phase.
  • Interviewing Subject Matter Experts, asking detailed questions and carefully recording the requirements in a format that can be reviewed and understood by both business and technical people.
  • Application testing experience with working knowledge of automated testing tools such as Mercury QuickTest Professional and LoadRunner.
  • Extensive working experience with the Salesforce cloud platform.
  • Working experience with various databases (Oracle, Teradata, SQL Server, DB2, Netezza) for data validation.
  • Worked with various OLAP tools (Business Objects, Crystal Reports, MicroStrategy, etc.) for multidimensional analysis of data.
  • Expertise in Test driven development & Unit testing methodologies for performing UAT, System Integration testing (SIT) and defect tracking using JIRA.
  • Experience in creating and executing Test Plan, Test Scripts, and Test Cases based on Design document and User Requirement document for testing purposes.

PROFESSIONAL EXPERIENCE

Confidential - Newark, NJ

Sr. Data Analyst

Responsibilities:

  • Met with stakeholders and users to review / manage expectations and identify opportunities to improve ROI.
  • Performed extensive data modeling to differentiate between the OLTP and Data Warehouse data models
  • Analyzed and mined business data to identify patterns and correlations among the various data points.
  • Offered excellent Cash Management Services including Account Reconcilement Services, Automated Clearing House, Balance Reporting Services, Cash Concentration Services and Sweep Accounts to large business customers
  • Documented data sources and transformation rules required to populate and maintain data warehouse content.
  • Created and Documented test cases and test data for Business Users.
  • Facilitated JAD sessions with end users, development and QA teams.
  • Created test scripts for all the test cases in PL/SQL.
  • Provided multiple demonstrations of Tableau functionalities and efficient data visualizations approaches using Tableau to the senior management at the client as part of the BRD.
  • Involved in data validation of the results in Tableau by validating the numbers against the data in the database tables through direct queries.
  • Acquired data from primary and secondary data sources such as Redshift, FTP, and external or internal files; extracted and analyzed data to generate reports.
  • Created logical and physical data modeling with STAR and SNOWFLAKE schema techniques using Erwin in Data warehouse as well as in Data Mart.
  • Worked on database objects such as tables, views, materialized views, procedures, and packages using Oracle tools such as Toad, PL/SQL Developer, and SQL*Plus.
  • Programmed metadata collection logic to notify stakeholders for approval, ensuring data accuracy and confidence in data quality.
  • Developed the Test plans for quality assurance based on functional requirements.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
  • Verified and maintained Data Quality, Integrity, data completeness, ETL rules, business logic.
  • Tested several UNIX Korn Shell scripts for ETL data loading.
  • Verified Informatica sessions, worklets and workflows in QA repository.
  • Extensively used SQL queries to check storage and accuracy of data in database tables.
  • Used SQL for Querying the Oracle database.
  • Performed physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, and Dimensions), Entities, Attributes, OLAP, OLTP, Cardinality, and ER Diagrams.
  • Developed control files for SQL Loader and PL/SQL programs for loading and validating the data into the Database.
  • Created stored procedures in both SQL Server and DB2 and was involved in several DTS packages.
  • Created various transformation procedures by using SAS, ETL and SAS Enterprise guide.
  • Worked on Data Modeling of both Logical Design and Physical Design of Data Warehouse and data marts in Star Schema and Snow Flake Schema methodology.
  • Involved in Informatica MDM processes including batch based and real-time processing.
  • Responsible for reviewing data model, database physical design, ETL design, and Presentation layer design.
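The report-validation work above (checking dashboard figures against the underlying tables) can be sketched in plain Python with SQLite standing in for the warehouse; the table name, columns, and figures below are illustrative assumptions, not actual project data:

```python
import sqlite3

# Hypothetical schema and sample rows -- names and values are invented
# for illustration, not taken from the actual project.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_balances (account_id TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO daily_balances VALUES (?, ?)",
    [("A1", 1200.50), ("A2", 830.25), ("A3", 410.00)],
)

# Figure shown on the dashboard (e.g., a Tableau total) to be validated.
reported_total = 2440.75

# Re-derive the same figure directly from the source table.
(db_total,) = conn.execute(
    "SELECT ROUND(SUM(balance), 2) FROM daily_balances"
).fetchone()

assert db_total == reported_total, f"mismatch: {db_total} != {reported_total}"
print("validated:", db_total)  # validated: 2440.75
```

In practice the same pattern scales to row counts, group-by totals, and spot checks per dimension, with the dashboard number on one side and a direct query on the other.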

Confidential - Mount Laurel, NJ

Data Analyst

Responsibilities:

  • Worked on requirement analysis, data analysis, and gap analysis of various source systems feeding in from multiple systems. Responsible for BI data quality.
  • Conducted JAD sessions to bring together different stakeholders, such as editorial and design teams.
  • Performed Business Process mapping for new requirements.
  • Performed Data Validation with Data profiling
  • Involved in data extraction from Teradata and flat files using SQL Assistant.
  • Designed reports in Access and Excel using advanced functions including, but not limited to, VLOOKUP, pivot tables, and formulas.
  • Used SQL and PL/SQL to validate the data going into the data warehouse.
  • Created complex data analysis queries to troubleshoot issues reported by users.
  • Evaluated data mining request requirements and helped develop the queries for the requests.
  • Built execution flows and loaded data into the Netezza data mart through the NZLOAD utility.
  • Tested the ETL data movement from the Oracle data mart to the Netezza data mart on both incremental and full-load bases.
  • Worked in both Agile (Scrum) and Waterfall models.
  • Wrote XSLT scripts for various XML needs and used XPath and XQuery for data retrieval from XML documents.
  • Responsible for creating the Requirements Traceability Matrix.
  • Involved in Designing Star Schema, Creating Fact tables, Dimension tables and defining the relationship between them.
  • Worked alongside the team creating and scheduling big data workflows in Spark, MapReduce, and Pig scripts.
  • Performed all aspects of verification, validation including functional, structural, regression, load and system testing.
  • Involved in backend testing for the front end of the application using SQL Queries in Teradata data base.
  • Wrote PL/SQL Stored Procedures, Functions, Cursors, Triggers, Packages and SQL queries using join conditions in UNIX environment.
  • Conducted UAT (User Acceptance Testing) for multiple iterations by writing test cases, and signed them off after approval.
  • Tested and Automated SAS jobs running on a daily, weekly and monthly basis using Unix Shell Scripting.
  • Worked on a MapR Hadoop platform to implement Bigdata solutions using Hive, MapReduce, shell scripting and Pig.
  • Involved in migrating data from Sybase IQ to Netezza and made Informatica code changes for compatibility with Netezza.
  • Experienced in working with DB2 and Teradata.
  • Worked with business team to test the reports developed in Cognos.
  • Tested whether the reports developed in Cognos are as per company standards.
  • Tested different detail, summary reports and on demand reports using Report Studio.
  • Reported bugs and tracked defects using Quality Center 10.
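The XPath-style retrieval mentioned above can be illustrated with Python's standard library; the XML payload and element names here are invented for the example, not the project's actual documents:

```python
import xml.etree.ElementTree as ET

# Illustrative XML document; tags and values are assumptions.
doc = ET.fromstring("""
<orders>
  <order id="1001"><status>shipped</status><total>250.00</total></order>
  <order id="1002"><status>pending</status><total>75.50</total></order>
  <order id="1003"><status>shipped</status><total>120.00</total></order>
</orders>
""")

# XPath-style predicate: every <order> whose <status> text is 'shipped'.
shipped = doc.findall(".//order[status='shipped']")
ids = [o.get("id") for o in shipped]
print(ids)  # ['1001', '1003']
```

Full XSLT and XQuery need dedicated processors, but `ElementTree` covers the common "pull matching nodes out of an XML feed" case with its limited XPath subset.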

Confidential - Philadelphia, PA

Data Analyst

Responsibilities:

  • Worked on designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Utilized a corporation-developed Agile SDLC methodology. Used ScrumWorks Pro and Microsoft Office software to perform required job functions.
  • Executed ETL operations for Business Intelligence reporting solutions using Excel.
  • Used shared containers and created reusable components for local and shared use in the ETL process.
  • Developed and maintained sales reporting using MS Excel queries and SQL in MS Access. Produced performance reports and implemented changes for improved reporting.
  • Tested reports created in Business Objects by running the underlying SQL statements, and documented the results.
  • Experienced in writing complex SQL queries for extracting data from multiple tables.
  • Designed Excel templates, created Pivot Tables and utilized VLOOKUPs with complex formulas.
  • Provided weekly, monthly & ad hoc web analytics reports using Adobe Site Catalyst & Google Analytics.
  • Analyzed, evaluated, and compared website analytics packages to create a gap analysis and action plan for the IT department's migration between Site Catalyst (Adobe Analytics) and Google Analytics.
  • Involved in Tableau data visualization using Cross Map, Scatter Plots, Geographic Map, Pie Charts and Bar Charts, Page Trails, and Density Chart.
  • Evaluated, identified, and solved process and system issues utilizing business analysis and design best practices, and recommended enterprise and Tableau solutions.
  • Substantial report development experience utilizing SQL Server Reporting Services (SSRS), Cognos Impromptu, and Microsoft Excel.
  • Wrote PL/SQL procedures for processing business logic in the database. Tuned SQL queries for better performance.
  • Utilized complex Excel functions such as pivot tables to manage large data sets and make information readable for other teams.
  • Developed SQL-based data warehouse environments and created multiple custom database applications for data archiving, analysis and reporting purposes.
  • Performed Data mapping, logical data modelling, created class diagrams and ER diagrams and used SQL queries to filter data within the Oracle database.
  • Created dashboard analyses using Tableau Desktop covering car rental records; analysis included customers, days of rental, service region, and travel type.
  • Performed extensive requirement analysis including Data analysis and Gap analysis.
  • Analyzed business requirements and segregated them into high-level and low-level use cases and activity diagrams using Rational Rose according to UML methodology, thus defining the data process models.
  • Used extracts to better analyze the data; the extracted data source was stored on Tableau Server and updated daily.
  • Designed and implemented basic SQL queries for QA Testing and Report / Data Validation.
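The pivot-table and VLOOKUP-style summarization described above can be mimicked in a few lines of Python; the sales rows and field names below are made up purely for illustration:

```python
from collections import defaultdict

# Hypothetical sales rows; regions, products, and amounts are invented.
rows = [
    {"region": "East", "product": "A", "amount": 100},
    {"region": "East", "product": "B", "amount": 150},
    {"region": "West", "product": "A", "amount": 200},
    {"region": "West", "product": "A", "amount": 50},
]

# Pivot: sum of amount by (region, product), like an Excel pivot table.
pivot = defaultdict(float)
for r in rows:
    pivot[(r["region"], r["product"])] += r["amount"]

# Lookup: the dict behaves like a VLOOKUP keyed on (region, product).
print(pivot[("West", "A")])  # 250.0
```

This is the same aggregation Excel performs when you drag `region` and `product` into pivot rows and sum `amount`, which is why such reports port cleanly between Excel and code.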

Confidential - Lady Lake, FL

Data Analyst

Responsibilities:

  • Analyzed the client data and business terms from a data quality and integrity perspective.
  • Performed root cause analysis on smaller, self-contained data analysis tasks related to assigned data processes.
  • Worked to ensure high levels of data consistency between diverse source systems including flat files, XML and SQL Database.
  • Involved in developing and running ad hoc data queries from multiple database types to identify systems of record, data inconsistencies, and data quality issues.
  • Translated business requirements into data requirements across different systems.
  • Worked on understanding the customer needs with regards to data, documenting requirements, developing complex SQL statements to extract the data and packaging/encrypting data for delivery to customers.
  • Wrote SQL stored procedures and views, and coordinated and performed in-depth testing of new and existing systems.
  • Provided support to Data Architect and Data Modeler in Designing and Implementing Databases for MDM using ERWIN Data Modeler Tool and MS Access.
  • Worked with Data Modeling team to create Logical/Physical models for Enterprise Data Warehouse.
  • Reviewed normalized/denormalized schemas for effective and optimal performance tuning of queries and data validations in OLTP and OLAP environments.
  • Identified the database tables for defining the queries for the reports using SSRS.
  • Exploited power of Teradata to solve complex business problems by data analysis on a large set of data.
  • Built Fast Load and Fast Export scripts to load data into Teradata and extract data from Teradata.
  • Familiar with using SET, MULTISET, derived, volatile, and global temporary tables in Teradata for larger ad hoc SQL requests.
  • Worked on data analysis using SQL, T-SQL and many other queries-based applications.
  • Used Teradata advanced techniques such as the OLAP functions CSUM, MAVG, MSUM, MDIFF, etc.
  • Developed reports using Teradata advanced techniques such as RANK, ROW_NUMBER, etc.
  • Efficient in process modeling using Erwin in both forward and reverse engineering cases.
  • Experienced in conducting JAD Sessions.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Developed data mapping documents between Legacy, Production, and User Interface Systems.
  • Documented data content, data relationships and structure, and data processing using Informatica PowerCenter Metadata Exchange.
  • Transferred data objects and queries from MS Access to SQL Server.
  • Assisted ETL team to define Source to Target Mappings.
  • Worked with Data Architect in Designing the CIM Model for Master Data Management.
  • Compiled and generated reports in a presentable format for the project team.
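Teradata's MAVG, used in the reporting above, returns for each row the average of that row and the preceding rows within a window of a given width (averaging fewer rows at the start of the partition). An equivalent over an ordered list can be sketched in Python; the sales values are invented for illustration:

```python
# Plain-Python equivalent of Teradata's MAVG(value, width) over an
# ordered sequence; input values below are invented sample data.
def mavg(values, width):
    out = []
    for i in range(len(values)):
        # Window: current value plus up to (width - 1) preceding values.
        window = values[max(0, i - width + 1) : i + 1]
        out.append(sum(window) / len(window))
    return out

sales = [10, 20, 30, 40]
print(mavg(sales, 2))  # [10.0, 15.0, 25.0, 35.0]
```

CSUM (cumulative sum) follows the same shape with a running total instead of a windowed average, which is why these OLAP functions translate directly into report logic.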

Confidential - Fremont, CA

Data Analyst

Responsibilities:

  • Maintained the Erwin model repository and Created DDL in development environments on database objects.
  • Planned and defined system requirements into Use Case Scenarios and Use Case Narratives using UML methodologies.
  • Created Use Case Diagrams, Data Flow Diagrams, Data mapping, Data Lineage mapping, Sequence Diagrams, ODD and ER Diagrams in MS Visio.
  • Used the BMC Remedy tool v8.1 for generating tickets.
  • Developed data models, data dictionaries, and CRUD matrices in SAP HANA, and database schemas such as star schema and snowflake schema used in dimensional and multidimensional (OLAP) databases.
  • Involved in Logical & Physical Data Modeling. Database Schema design and modification of Triggers, Scripts, Stored Procedures in Sybase Database Servers.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements.
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access. Produced performance reports and implemented changes for improved reporting.
  • Performed backend testing using SQL queries and analyzed the server performance on UNIX.
  • Worked on developing several detail and summary reports including line and pie charts, trend analysis reports and sub-reports according to business requirements using SQL Server Reporting Services (SSRS).
  • Loaded staging tables on Teradata and further loaded target tables on SQL Server via DTS transformation packages.
  • Guided clients by generating a data forecast report for minimum investment in their upcoming projects using JMP.
  • Reviewed Stored Procedures for reports and wrote test queries against the source system (SQL Server-SSRS) to match the results with the actual report against the Data mart (Oracle).
  • Loaded external data into TADDM from Discovery Library Books and generated custom Cognos reports.
  • Prepared the IBM Tivoli Monitoring infrastructure for TADDM discovery.
  • Involved in developing ETL solutions using Clover ETL, R, and Python to process client data.
  • Conducted JAD sessions with management, SME, vendors, user and external agencies and other stakeholders for open and pending issues.
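The Python-based ETL work mentioned above follows the usual extract-transform-load shape; a minimal stdlib sketch, where the file layout, field names, and target table are assumptions for illustration rather than the client's actual schema:

```python
import csv
import io
import sqlite3

# Extract: read a (simulated) CSV feed; layout is an invented example.
raw = io.StringIO("client_id,revenue\n C01 ,1000\nC02, 2500 \n")
rows = list(csv.DictReader(raw))

# Transform: trim stray whitespace and cast revenue to int.
clean = [(r["client_id"].strip(), int(r["revenue"].strip())) for r in rows]

# Load: write the cleaned rows into a target table (SQLite stands in
# for the real destination database here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE client_revenue (client_id TEXT, revenue INTEGER)")
conn.executemany("INSERT INTO client_revenue VALUES (?, ?)", clean)

(total,) = conn.execute("SELECT SUM(revenue) FROM client_revenue").fetchone()
print(total)  # 3500
```

Dedicated tools like CloverETL wrap the same three stages in graphical components; the cleansing step (trimming, casting, rejecting bad rows) is typically where most of the client-specific logic lives.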
