Sr. Data Analyst Resume

Jacksonville, FL

PROFESSIONAL SUMMARY:

  • Over nine years of professional IT experience as a Data Analyst, with a solid understanding of business requirements gathering, business process flow, business process modeling, and business analysis.
  • Experience in reporting, analytical thinking, process improvement, problem solving, time management, decision making, and issue resolution.
  • Strong working knowledge of Oracle 11g/10g/9i/8i, SQL, PL/SQL (procedures, packages, functions, database triggers), Teradata.
  • Experience in creating DDL, DML, and transaction queries in SQL (see the sketch after this list).
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Expert in data modeling, data analysis, data visualization, and modern data warehouse concepts. Designed various reports and dashboards to provide insights and data visualization using BI/ETL tools such as mainframe SAS, SSAS, SSIS, Business Objects, and Tableau across OLTP and OLAP environments.
  • Experience working with BODS against different data sources (flat files, Oracle, SAP ECC, Microsoft SQL Server).
  • Good understanding of Oracle Data Dictionary, Oracle Workflow, Data Flow Diagrams, ER Diagrams, Data warehousing concepts, RDBMS and Normalization Techniques.
  • Experience in data analysis, data migration, data transformation, and data integration.
  • Solid experience in Data warehousing best practices working with Metadata, repositories and experience within a disciplined lifecycle methodology.
  • Designed ETL specification documents to load data from source to target using various transformations according to the business logic.
  • Extensive experience in project management best practices, processes, & methodologies, including Rational Unified Process (RUP) and SDLC.
  • Good experience in working with very large databases and performance tuning.
  • Proficiency in database design concepts, architecture and ETL design strategies.
  • Prepared dashboards using calculations, parameters, calculated fields, groups, sets, and hierarchies in Tableau.
  • Designed, developed, and deployed highly complex BI solutions via Cognos 8 Report Studio, Analysis Studio and Transformer.
  • Experienced in working with OLTP database, OLAP database and data warehouses.
  • Expertise in the Scrum Agile framework and methodology, with experience in daily stand-up, sprint planning, and sprint retrospective meetings.
  • Proficient in conducting JAD sessions with management, SMEs, vendors, users, and other stakeholders.
  • Familiar with current industry standards, such as HIPAA, ISO, Six Sigma, and Capability Maturity Model (CMM).
  • Knowledge of code sets such as revenue codes, procedure codes (CPT-4, HCPCS, and ICD-9/10), Diagnosis Related Group (DRG) codes, and place-of-service (POS) codes; knowledge of government programs including Medicare, Medicaid, and CHIP.
  • Constantly researched and evaluated new technologies to provide more efficient and cost-effective solutions for disaster recovery and backup of mission-critical systems.
  • Acted as a liaison between the technology and business areas of Organizations.
  • Experience in interpreting and applying data modeling, feasibility studies, System Requirements Specification (SRS), Scope Document, Applications Architecture and Request for Proposal (RFP).
  • Experience in conducting Joint Application Development (JAD) sessions with end users, Subject Matter Experts (SMEs), and Development and QA teams.
  • Experience providing data, application, and technology consulting in pre-feasibility and feasibility discussions with IT team members and business partners, defining strategy level architectures (domain target architectures) and system/technology level architecture.
  • Experience in conducting unit testing, IST testing, and UAT.
  • Experienced in Performing User Acceptance Testing (UAT) for data verification.
  • Good understanding of banking and healthcare industry business processes.
  • Team player, also able to work independently with minimal supervision; innovative and efficient, good at debugging, with a strong desire to keep pace with the latest technologies.
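
A minimal sketch of DDL, DML, and transaction-control SQL of the kind referenced above (table and column names are illustrative, not from an actual engagement):

    -- DDL: define a simple table
    CREATE TABLE employee (
        emp_id   INTEGER PRIMARY KEY,
        emp_name VARCHAR(100) NOT NULL,
        dept_id  INTEGER
    );

    -- DML inside a transaction (Oracle-style: the transaction opens implicitly
    -- with the first statement and ends with COMMIT or ROLLBACK)
    INSERT INTO employee (emp_id, emp_name, dept_id) VALUES (1, 'Jane Doe', 10);
    UPDATE employee SET dept_id = 20 WHERE emp_id = 1;
    COMMIT;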

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 8.x/9.x, IBM WebSphere DataStage and QualityStage.

Reporting Tools: Cognos, Business Objects.

Databases: Oracle 11g/10g, MySQL, MS SQL Server, Teradata.

Defect Tracking Tools: HP Quality Center, JIRA

Languages/Web: SQL, PL/SQL, SQL*Plus, HTML, XML, ASP, Java, UNIX Korn shell scripting, MS Excel.

GUI Tools: SQL*Loader, TOAD, Data Loader Tool.

Modeling Tool: Erwin, Visio

Environment: HP-UX, AIX 4.5, Solaris 2.x, MS Windows, Windows NT.

PROFESSIONAL EXPERIENCE:

Confidential, Jacksonville, FL

Sr. Data Analyst

Responsibilities:

  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Involved in data governance; defined data mapping from source to target for all three heritage systems.
  • Gathered metadata and created a data analysis repository (spreadsheet) that included the following metadata: data source, hierarchy, fact, dimension, data attribute, data integration, data model, data acquisition, etc.
  • Worked with subject matter experts, Finance and HR data analysts, and other data modelers to create data modeling deliverables.
  • Wrote complex SQL queries against the target Oracle database to generate reports and compared them against legacy Access database reports.
  • Used various transformations like Filter, Expression, Aggregator, Sequence Generator, Source qualifier, Update Strategy, Joiner, Normalizer, Router, XML generator, XML Source qualifier, Connected Look up, Unconnected lookup, Stored Procedure and Union to develop mappings in the Informatica Designer.
  • Involved in data mining, data profiling, and data modeling.
  • Tested Cognos reports by writing complex SQL queries and verified that they were generated per company standards.
  • Extensively involved with data quality and data governance solutions including platforms and supporting data processes
  • Defined data requirements and elements used in XML transactions.
  • Performed data validation with data profiling.
  • Involved in the testing of the Data Mart using Informatica Power Center.
  • Identified and documented additional data cleansing needs and consistent error patterns that could be averted by modifying ETL code.
  • Developed mappings for Slowly Changing Dimensions of Type1, Type2, Facts and Summary tables using all kinds of transformations.
  • Extensively used Informatica Power Center for Extraction, Transformation and Loading process.
  • Extensively tested several ETL Mappings developed using Informatica.
  • Extensively used the Teradata load utilities FastLoad, MultiLoad, and FastExport to extract, transform, and load the Teradata data warehouse.
  • Worked in an Agile environment using Scrum.
  • Worked at conceptual/logical/physical data model level using Erwin according to the requirements.
  • Queried Teradata Database and validated the data using SQL Assistant.
  • Used the application's import and export facilities to download/upload XMLs of failed test cases for re-verification.
  • Scheduled jobs using Autosys, automated them to run at specific times, and automated the reports.
  • Wrote UNIX scripts to perform routine tasks and assisted developers with problems and SQL optimization.
  • Wrote complex T-SQL and SQL queries using joins, subqueries, and correlated subqueries.
  • Performed Unit testing and System Integration testing by developing and documenting test cases in Quality Center.
  • Designed & developed various Ad hoc reports for different teams in Business (Teradata and Oracle SQL, MS ACCESS, MS EXCEL)
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the loading and pulling of data.
  • Involved in extensive data validation using SQL queries and back-end testing (see the sketch after this list).
  • Validated cube and query data from the reporting system back to the source system.
  • Tested analytical reports using Analysis Studio.
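
A minimal sketch of the source-to-target validation SQL referenced above (schema, table, and column names are illustrative, not from the actual engagement):

    -- Compare row counts between the Oracle target and the staged source
    SELECT 'TARGET' AS side, COUNT(*) AS row_count FROM dw.claim_fact
    UNION ALL
    SELECT 'SOURCE', COUNT(*) FROM stg.claim_stage;

    -- Flag target rows whose amounts disagree with the source
    -- (a correlated subquery of the kind described above)
    SELECT f.claim_id, f.claim_amount
    FROM   dw.claim_fact f
    WHERE  f.claim_amount <> (SELECT s.claim_amount
                              FROM   stg.claim_stage s
                              WHERE  s.claim_id = f.claim_id);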

Environment: Informatica 9.x, Cognos 8.0 Series, flat files, MS SQL Server 2012, Oracle 11g, SQL, PL/SQL, TOAD, IBM DB2, Agile, Teradata 13, Teradata SQL Assistant, Erwin, HP ALM, HP Quality Center 10, Autosys, XML, UNIX Shell Scripting.

Confidential, Philadelphia, PA

Sr. Data Analyst

Responsibilities:

  • Worked with a diverse IT community to identify project data requirements and the detailed data requirements needed to source data for the project.
  • Designed and developed conceptual, logical and physical data models.
  • Designed the ETL architecture to process large numbers of files and created high-level and low-level design documents.
  • Possess expertise with relational database concepts, stored procedures, functions, triggers and scalability analysis.
  • Designed and developed weekly, monthly reports related to the marketing and financial departments using Teradata SQL.
  • Worked with clients to better understand their needs and presented solutions using a structured SDLC approach.
  • Loaded data from flat-file and table sources to staging using the Teradata MultiLoad (MLOAD) and FastLoad (FLOAD) utilities.
  • Wrote T-SQL statements for retrieval of data and was involved in performance tuning of T-SQL queries and stored procedures.
  • Extensively used JIRA for executing the test cases, defect tracking and test management.
  • Built reusable Mapplets using Informatica Power Center Designer.
  • Worked in various stages of the data warehousing development life cycle: logical and physical database design, the ETL process, and performance tuning on Windows NT and UNIX operating systems on SMP and MPP machines.
  • Used Informatica Workflow Manager to create sessions, workflows to run the logic embedded in the Mappings.
  • Involved in GAP analysis of source systems and target systems.
  • Designed ETL Specification documents to load data from source to target using various transformations according to the business logic.
  • Created data flow diagrams and process flow diagrams for various load components like FTP Load, SQL Loader Load, ETL process and various other processes that required transformation.
  • Optimized SQL and PL/SQL queries using tuning techniques such as optimizer hints, parallel processing, and optimization rules.
  • Wrote UNIX shell scripts to automate loading files into the database using crontab.
  • Created columnstore indexes on dimension and fact tables in the OLTP database to enhance read operations.
  • Extensively developed PL/SQL stored procedures, functions, triggers, and packages (see the sketch after this list).
  • Documented the existing mappings as per the design standards followed in the project.
  • Worked with ETL developers with data mapping specifications.
  • Developed complex mappings to load the data from source systems (Oracle).
  • Wrote PL/SQL statements and stored procedures in Oracle for extracting as well as writing data.
  • Carried out defect analysis and fixed bugs raised by users.
  • Highly motivated with the ability to work effectively in teams as well as independently.
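
A minimal sketch of a PL/SQL stored procedure of the kind described above (the customers table and its columns are illustrative):

    -- Illustrative PL/SQL: update a customer record, inserting it if absent
    CREATE OR REPLACE PROCEDURE upsert_customer (
        p_cust_id   IN customers.cust_id%TYPE,
        p_cust_name IN customers.cust_name%TYPE
    ) AS
    BEGIN
        UPDATE customers
           SET cust_name = p_cust_name
         WHERE cust_id = p_cust_id;

        -- If no row was updated, the customer is new: insert it
        IF SQL%ROWCOUNT = 0 THEN
            INSERT INTO customers (cust_id, cust_name)
            VALUES (p_cust_id, p_cust_name);
        END IF;

        COMMIT;
    END upsert_customer;
    /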

Environment: Informatica PowerCenter 9.x/8.6, SQL, PL/SQL, Oracle 10g/11g, Erwin, Teradata, MS Word, Visio, PowerPoint, MS Excel, UNIX Shell Scripting, Business Objects, JIRA, LoadRunner, MS Access, OLAP.

Confidential, Omaha, NE

Sr. Data Analyst

Responsibilities:

  • Participated in meetings with stakeholders, business managers, users to understand and document business requirements and functional specifications for the Workflow.
  • Worked on data mapping for various source systems.
  • Designed the ETL architecture to process large numbers of files and created high-level and low-level design documents.
  • Used Agile Central for planning poker and story-point estimation.
  • Involved in Data Mining, Data Profiling and Data Modeling.
  • Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Informatica.
  • Designed & created OLAP Cubes with Star schema using SSAS.
  • Met with user groups to analyze requirements and proposed changes in design and specifications.
  • Wrote Excel VBA macros to automate tasks in Excel.
  • Extensively worked on Shell scripts for running SAS Programs in batch mode on UNIX.
  • Developed Oracle PL/SQL triggers and procedures; set up an Oracle PL/SQL package to analyze tables and indexes, record statistics, and rebuild indexes.
  • Developed SSIS packages, architecting the flow of data from various sources such as MS SQL Server 2008, Oracle and MS Excel to target (MS SQL Server 2012).
  • Used JIRA for issue tracking.
  • Created data flow diagrams and process flow diagrams for various load components like FTP Load, SQL Loader Load, ETL process and various other processes that required transformation.
  • Worked extensively on data visualizations by designing, developing and deploying of Tableau dashboards on Tableau server.
  • Wrote hundreds of DDL scripts to create tables and views in the company Data Warehouse.
  • Designed and developed the star-schema data model and fact tables to load data into the data warehouse (see the sketch after this list).
  • Responsible for SQL tuning and optimization using ANALYZE, EXPLAIN PLAN, and optimizer hints.
  • Strong knowledge of database environments like Oracle, DB2 and Sybase.
  • Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) to maintain the data marts (loading data and analyzing it using OLAP tools).
  • Conducted task status, functionality review, and scrum meetings, and adjusted the estimated time for completing tasks.
  • Worked on building load rules and transformation logic for (ETL) of the data into the data warehouse.
  • Built data integration, workflow, and Extract, Transform, Load (ETL) solutions for data warehousing using SQL Server Integration Services (SSIS).
  • Worked on creating the test cases and the UAT process.
  • Created SQL, PL/SQL, SQL Loader control files, functions, procedures, and UNIX Shell scripts.
  • Created the XML control files to upload data into the data warehousing system.
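
A minimal sketch of the star-schema DDL referenced above (tables and columns are illustrative):

    -- Illustrative star schema: one fact table keyed to two dimensions
    CREATE TABLE dim_date (
        date_key    INTEGER PRIMARY KEY,
        calendar_dt DATE    NOT NULL
    );

    CREATE TABLE dim_product (
        product_key  INTEGER      PRIMARY KEY,
        product_name VARCHAR(100) NOT NULL
    );

    CREATE TABLE fact_sales (
        date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
        product_key  INTEGER NOT NULL REFERENCES dim_product (product_key),
        sales_amount DECIMAL(12,2),
        units_sold   INTEGER
    );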

Environment: Informatica PowerCenter Designer 9.x/8.x, Tableau 9.3/10, ETL, SSIS, RUP, UNIX Shell Scripting, Oracle 11g, PL/SQL, SQL*Loader, Agile Scrum framework, Teradata R14, MS Excel, DB2.

Confidential, Wilmington, DE

SQL Data Analyst

Responsibilities:

  • Gathered the business requirements by conducting a series of meetings with business users.
  • Created the logical data model from the conceptual model and converted it into the physical database design.
  • Tuned query performance by working intensively on the indexes.
  • Extensively used Star Schema methodologies in building and designing the logical data model into Dimensional Models.
  • Used Sybase Power Designer tool for relational database and dimensional data warehouse designs.
  • Created reports using SQL Server Reporting Services (SSRS).
  • Involved in Data mapping specifications to create and execute detailed system test plans.
  • The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Used SQL for querying the database in the UNIX environment.
  • Responsible for creating mapping documents required for the ETL team.
  • Created SQL, PL/SQL, SQL Loader control files, functions, procedures, and UNIX Shell scripts.
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
  • Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) to maintain the data marts (loading data and analyzing it using OLAP tools).
  • Worked on the Extraction, Transformation, and Loading (ETL) process using SSIS.
  • Conducted data mapping sessions and wrote the required business requirements documentation, including use cases.
  • Created entity-relationship diagrams, functional decomposition diagrams and data flow diagrams.
  • Used Star Schema and Snowflake Schema for the data marts / data warehouse.
  • Prepared the Joins/Filter logic documents that would help the ETL design team perform the Joins on the tables that are in the form of flat files before loading them to FDS or any downstream systems.
  • Created documentation and test cases, worked with users for new module enhancements and testing.
  • Used SQL Server 2005 tools like Management Studio, Query Editor, Business Intelligence Development Studio (BIDS) including SSIS and SSRS.
  • Created job schedules to automate the ETL process.
  • Normalized the database up to 3NF before loading the data into the star schema of the data warehouse.
  • Wrote PL/SQL statements and stored procedures in Oracle for extracting as well as writing data.
  • Collected business requirements to set rules for proper data transfer from Data Source to Data Target in Data Mapping.
  • Loaded staging tables on Teradata and then loaded target tables on SQL Server via a DTS transformation package (see the sketch after this list).
  • Reduced labor-intensive manual processes such as job entry, approval, and invoicing by exposing a well-defined SOA interface, allowing B2B integration with financial clients.
  • Worked on building load rules and transformation logic for Extraction, Transformation and Loading (ETL) of the data into the data warehouse.
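
A minimal sketch of the staging-to-target load pattern referenced above (T-SQL; table names are illustrative):

    -- Illustrative T-SQL: move only new rows from staging into the target
    BEGIN TRANSACTION;

    INSERT INTO dbo.sales_target (sale_id, sale_dt, amount)
    SELECT s.sale_id, s.sale_dt, s.amount
    FROM   dbo.sales_staging AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.sales_target AS t
                       WHERE  t.sale_id = s.sale_id);

    COMMIT TRANSACTION;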

Environment: Sybase, SOA, Sybase PowerDesigner, SSIS, DTS, Rational Rose, Oracle 9i, ETL, Teradata, Oracle Designer, PL/SQL, XSD, DOS and UNIX shell scripting, Sequential Files

Confidential, Washington, DC

Jr. Data Analyst

Responsibilities:

  • Involved in various stages like development, system analysis and design.
  • Created levels of Data Flow Diagram using Microsoft Office Visio.
  • Designed and developed the entity-relationship diagram and kept the data dictionary up to date after any alterations were made to the metadata and data in each table.
  • Created SQL Agent jobs to run SSIS packages.
  • Developed SSRS reports such as drill-through, drill-down, linked, and parameterized reports.
  • Performed data discovery, aggregation, and transformation based on business use cases and requirements.
  • Configured the server to send automatic emails to the respective people on SSIS process failure or success.
  • Created data-storage migration space to move physical data blocks from one disk to another for visualization purposes.
  • Extracted data from client/server databases to transfer and load tables into the report databases using SSIS packages (SQL Server 2008).
  • Created multiple automated reports and dashboards sourced from the data warehouse using Tableau/SSRS (see the sketch after this list).
  • Developed storytelling dashboards in Tableau Desktop and published them to Tableau Server, allowing end users to understand the data on the fly using quick filters for on-demand information.
  • Analyzed database requirements from the users in terms of loading dimensions and fact tables using SSIS Packages.
  • Physically migrated the developed reports onto the development SSRS server.
  • Developed UNIX Shell scripts to automate various periodically repetitive database processes.
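
A minimal sketch of the kind of aggregation query that fed those reports and dashboards (T-SQL; all names, including the @StartDate parameter, are illustrative):

    -- Illustrative T-SQL aggregation behind a parameterized report
    SELECT d.region,
           YEAR(f.order_dt)    AS order_year,
           SUM(f.order_amount) AS total_amount,
           COUNT(*)            AS order_count
    FROM   dbo.order_fact AS f
    JOIN   dbo.region_dim AS d
           ON d.region_key = f.region_key
    WHERE  f.order_dt >= @StartDate   -- supplied as a report parameter
    GROUP BY d.region, YEAR(f.order_dt)
    ORDER BY d.region, order_year;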

Environment: MS SQL Server 2012, MS Excel, T-SQL, SSIS, SSRS, SQL*Loader, SQL*Plus.
