
Sr. Data Analyst Resume


Mt Laurel, NJ

SUMMARY

  • 7+ years of IT experience in Data Analysis, Design, Development, Maintenance and Documentation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, Client/Server applications and Web applications on UNIX and Windows platforms.
  • Experienced in data warehouse development and execution of data conversion, data cleansing and extraction, data governance, data profiling, and standardization strategies and plans, including consolidating several small tables into a single Master Data Management (MDM) repository.
  • Strong working experience in various SDLC methodologies such as RUP, Waterfall, and Agile - Scrum.
  • Extensive experience with Data Warehousing, Extraction, Transformation and Loading (ETL) and Business Intelligence (BI) tools.
  • Strong experience in Data Profiling, Data Migration, Data Integration and Metadata Management Services.
  • Experience working closely with key stakeholders, Subject Matter Experts (SMEs), application owners, data owners, and cross-functional teams to establish data lineage (source system details, server details, tables, columns, views, web services, integration methods, etc.).
  • Hands on experience in SQL Server Management Studio (SSMS), SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS) and SQL Server Analysis Services (SSAS).
  • Proficient in validating APIs, checking whether the XML data returned matches what is defined in the API documentation.
  • Experience in troubleshooting test scripts, SQL queries, ETL jobs, data warehouse/data mart/data store models.
  • Experience in writing Oracle SQL queries and procedures based on requirements.
  • Experienced in using built-in SQL functions for data analysis and manual testing.
  • Experienced in analyzing metadata from various data tables and databases and utilizing it for ETL and data quality processes.
  • Experienced in choosing the right data types for metadata based on the source data, helping maintain consistency between source and target systems.
  • Experience in analyzing data with SQL in native database environments such as Teradata and Oracle, performing data profiling/data discovery to understand the data.
  • Experienced in using Python to manipulate data for loading and extraction.
  • Experience in automating recurring reports using SQL and Python.
  • Excellent understanding of ETL and data modeling tools such as Ab Initio, Informatica, and Erwin.
  • Proficient in Power Platform using Power Apps, Power Automate and PowerBI.
  • Experience developing Custom BI Reports.
  • Experience executing SQL queries and documenting them as part of validating Business Objects reports and for testing purposes.
  • Ability to understand Data lineages, Hierarchies and Relationships in RDBMS applications.
  • Experience providing multiple solutions based on analytical tools such as Tableau and MS Power BI and on statistical programming languages (Python, R, SQL, SAS).
  • Strong work ethic; highly motivated team player with good communication, presentation, and interpersonal skills, able to manage and work in a multicultural workplace.
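The SQL-driven data profiling mentioned above can be sketched as follows. This is a minimal illustration, not code from any engagement: the table, column names, and sample rows are hypothetical, and an in-memory SQLite database stands in for Teradata/Oracle.

```python
# Data-profiling sketch: per-column row count, null count, and distinct count.
# Table/column names and data are illustrative; SQLite stands in for a
# production database such as Teradata or Oracle.
import sqlite3

def profile_column(conn, table, column):
    """Return (row_count, null_count, distinct_count) for one column."""
    return conn.execute(
        f"SELECT COUNT(*), "
        f"SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END), "
        f"COUNT(DISTINCT {column}) "
        f"FROM {table}"
    ).fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@x.com"), (2, None), (3, "a@x.com")],
)
stats = profile_column(conn, "customers", "email")
print(stats)  # (3, 1, 1): 3 rows, 1 null, 1 distinct non-null value
```

Running the same query per column across a source table is a common first step in data discovery before defining cleansing and standardization rules.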

TECHNICAL SKILLS

Database Tools: SQL Server Management Studio (SSMS), SQL*Loader, SQL Developer, TOAD

Programming Languages: SQL, PL/SQL, HTML, UNIX shell Scripting, Python, R

Reporting Tools: Tableau, Power BI, Cognos, Report Builder, SQL Server Reporting Services (SSRS)

ETL Tools: Informatica, SSIS; Data Modeling: Erwin

BI Tools: Tableau, Crystal Reports, SAS, Business Objects, Power BI

Databases: Oracle, SQL Server, DB2, Amazon Cloud (AWS)

Operating Systems: Windows XP, UNIX, Linux

RDBMS: SQL Server, Oracle, and MS Access

Utilities/Applications: MS Office (Excel, Outlook, PowerPoint), MS Project, MS Access

PROFESSIONAL EXPERIENCE

Confidential - Mt Laurel, NJ

Sr. Data Analyst

Responsibilities:

  • Acquire, clean and structure data from multiple sources and maintain databases/data systems.
  • Identify, analyze, and interpret trends or patterns in complex datasets.
  • Filter and “clean” data, and review computer reports, printouts, and performance indicators to detect and correct code problems.
  • Worked on massive structured, unstructured, transactional, and real-time datasets from a variety of sources to analyze customer usage patterns and provide actionable, impactful, intuitive insights using statistics, metrics, and algorithms.
  • Performed data analysis across all source systems and the target system and documented it, covering NAV (Net Asset Value), the amalgamation process, the relationship between HR and NAV, etc., to report any outage or data discrepancy.
  • Performed data analysis to monitor daily KPI performance using Adobe Analytics, investigated the changes in key metrics, coordinated with designers and developers to make data informed changes in web application and improve Verizon’s digital user experience.
  • Analyzed the data related to IT infrastructure operations and developed a sustainable data intake channel using Alteryx and Power BI.
  • Worked closely with designers and developers to formulate data supporting digital strategies, developed test plans for optimization and performed post optimization analysis (A/B testing).
  • Responsible for technical documentation and post-production support.
  • Worked on data mapping and logical data modeling; used SQL queries to filter data within database tables.
  • Provided application support and involved in day-to-day support activities including creation of support Incidents, change requests and attending approval meetings for high priority problem resolutions.
  • Developed status reports and pivot tables for leadership.
  • Created overall application health analysis metrics reports for upper-management teams, covering multiple applications that reported on overall availability, performance, and end-to-end transaction flow.
  • Developed SAS macros for ad-hoc reporting in SAS Enterprise Guide using Query Builder and SQL.
  • Troubleshot ETL failures and performed manual loads using SQL stored procedures.
  • Created SQL, PL/SQL, SQL Loader control files, functions, procedures, and UNIX Shell scripts.
  • Worked with large volumes of data; extracted and manipulated large datasets using standard tools such as Python, R, and SQL.
  • Collaborated cross functionally with data science team and other teams including back-end developers, product managers etc. to help define problems, collect data and build analytical models.
  • Maintained production application support on data warehouse and OLTP side.
  • Created and maintained various ETL processes utilizing SQL*Loader and the Informatica tool.
  • Evaluated data profiling, cleansing, integration, and extraction tools (e.g., Informatica).
  • Involved in writing complex SQL queries to check the data integrity.
  • Performed data visualizations using Tableau.
  • Involved in defining the Source to Target data mappings, Business rules, data definitions.
  • Participated in development and deployment of cloud-based systems on AWS.
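The data-integrity checks described above can be illustrated with a small sketch: reconcile row counts between a source and target table and list keys that failed to load. The schema and data are hypothetical, and in-memory SQLite stands in for the production database.

```python
# Source-to-target reconciliation sketch (illustrative schema; SQLite in
# place of a production database): compare row counts and find key values
# present in the source but missing from the target.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (3, 30.0);
""")

src_count = conn.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
missing = [r[0] for r in conn.execute(
    "SELECT s.id FROM src s LEFT JOIN tgt t ON s.id = t.id "
    "WHERE t.id IS NULL"
)]
print(src_count, tgt_count, missing)  # 3 2 [2]
```

A count mismatch plus the list of missing keys is usually enough to open a support incident and target a manual reload.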

Environment: Python, R, SQL, SAS, PostgreSQL, ETL, Tableau, Power BI, Adobe Analytics, AWS

Confidential - Union, NJ

Data Analyst

Responsibilities:

  • Acquire, clean and structure data from multiple sources and maintain databases/data systems.
  • Identify, analyze, and interpret trends or patterns in complex datasets.
  • Filter and “clean” data, and review computer reports, printouts, and performance indicators to detect and correct code problems.
  • Worked on massive structured, unstructured, transactional, and real-time datasets from a variety of sources to analyze customer usage patterns and provide actionable, impactful, intuitive insights using statistics, metrics and algorithms.
  • Performed data analysis across all source systems and the target system and documented it, covering NAV (Net Asset Value), the amalgamation process, the relationship between HR and NAV, etc., to report any outage or data discrepancy.
  • Worked on data mapping and logical data modeling; used SQL queries to filter data within database tables.
  • Provided application support and involved in day-to-day support activities including creation of support Incidents, change requests and attending approval meetings for high priority problem resolutions.
  • Made use of statistical modeling in forecasting/predictive analytics, segmentation methodologies, regression-based models, hypothesis testing, factor analysis/PCA, and ensembles.
  • Created Interactive Tableau Dashboard after gathering and analyzing the data from the warehouse to illustrate the metrics of the business process.
  • Developed status reports and pivot tables for leadership.
  • Created overall application health analysis metrics reports for upper-management teams, covering multiple applications that reported on overall availability, performance, and end-to-end transaction flow.
  • Managed data collection of web customer behaviors and satisfaction through Adobe Analytics.
  • Developed SAS macros for ad-hoc reporting in SAS Enterprise Guide using Query Builder and SQL.
  • Involved end to end with data science tools and techniques, including data manipulation using SQL.
  • Took ownership of analytical projects end to end, from extracting and exploring data and tracking product feature usage with Adobe Analytics to presenting findings to product managers.
  • Troubleshot ETL failures and performed manual loads using SQL stored procedures.
  • Created SQL, PL/SQL, SQL Loader control files, functions, procedures, and UNIX Shell scripts.
  • Worked with large volumes of data; extracted and manipulated large datasets using standard tools such as Python, R, and SQL.
  • Collaborated cross functionally with data science team and other teams including back-end developers, product managers etc. to help define problems, collect data and build analytical models.
  • Designed, developed, implemented, and maintained scheduled analytics, reporting, and scorecards using Excel and Power BI.
  • Designed simple yet robust algorithms using rule-based or optimization-based linear programming approaches to optimize data.
  • Identified data patterns, provided metrics, diagnosed problems, and provided intelligence for business operations using Tableau.
  • Involved in publishing of various kinds of live, interactive data visualizations, dashboards, reports, and workbooks from Tableau Desktop to Tableau servers.
  • Created and maintained various ETL processes utilizing SQL*Loader and the Informatica tool.
  • Evaluated data profiling, cleansing, integration and extraction using Informatica.
  • Fixed the errors which came up on data migration from one database to another.
  • Analyzed, understood error scenarios and came up with scripts to fix them in bulk in minimal time span.
  • Wrote queries for error fixing and worked with complex database structures for various products.
  • Involved in writing complex SQL queries to check the data integrity.
  • Involved in defining the Source to Target data mappings, Business rules, data definitions.
  • Participated in development and deployment of cloud-based systems on AWS.
  • Developed and maintained dashboards using Tableau Desktop and published them to Tableau Server to meet business-specific challenges.
  • Automated regular AWS tasks, such as snapshot creation, using Python scripts.
  • Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and handled configuration management using Chef.
  • Responsible for technical documentation and post-production support.
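The status reports and pivot tables mentioned above follow a simple shape that can be sketched in plain Python. The incident records below are made up; in practice the rows would come from a ticketing system or database extract.

```python
# Pivot-style status summary sketch (incident data is hypothetical):
# count support incidents by application and status, the kind of
# cross-tab that feeds a leadership status report.
from collections import Counter

incidents = [
    {"app": "billing", "status": "open"},
    {"app": "billing", "status": "closed"},
    {"app": "portal",  "status": "open"},
    {"app": "billing", "status": "open"},
]

# Counter keyed by (app, status) acts as a two-dimensional pivot table.
pivot = Counter((i["app"], i["status"]) for i in incidents)
print(pivot[("billing", "open")])  # 2
```

The same aggregation is what an Excel pivot table or a Power BI matrix visual computes; doing it in code makes the report repeatable on a schedule.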

Environment: Python, R, SQL, Adobe Analytics, PostgreSQL, ETL, Tableau, Power BI, Unix, AWS, DB2.

Confidential - Roanoke, VA

Data Analyst/ Reports Analyst

Responsibilities:

  • Worked on a daily basis with lead Data Warehouse developers to evaluate the impact on the current implementation and the redesign of all ETL logic.
  • Worked with users to identify the most appropriate source of record and profile the data required for sales and service.
  • Participated in user meetings, gathered Business requirements & specifications for the Data-warehouse design. Translated the user inputs into ETL design docs.
  • Documented the complete process flow, describing program development, logic, testing, implementation, application integration, and coding.
  • Wrote T-SQL statements for retrieval of data and involved in performance tuning of T-SQL queries and Stored Procedures.
  • Involved in defining the business/transformation rules applied for ICP data.
  • Defined the list codes and code conversions between the source systems and the data mart.
  • Reviewed the data model and reporting requirements for Cognos Reports with the Data warehouse/ETL and Reporting team.
  • Worked with internal architects and assisted in the development of current- and target-state data architectures.
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
  • Involved in defining the source to target data mappings, business rules, business and data definitions.
  • Documented, clarified, and communicated change requests with the requestor and coordinated with the development and testing teams.
  • Involved in configuration management in the process of creating and maintaining an up-to-date record of all the components of the development efforts in coding and designing schemas.
  • Interacted with computer systems end-users and project business sponsors to determine, document, and obtain signoff on business requirements.
  • Compiled analytics and reports of issues for both development and business teams using Jira and Quantum Metrics.
  • Prepared dashboard reports in Tableau to visualize the monthly activities of the application support which drove the business to take decisions regarding their continuous improvement.
  • Developed Tableau workbooks from multiple data sources using Data Blending.
  • Utilized Informatica toolset (Informatica Data Explorer, and Informatica Data Quality) to analyze legacy data for data profiling.
  • Pulled data from the warehouse and sorted it according to data elements.
  • Responsible for maintaining the Enterprise Metadata Library, recording any changes or updates.
  • Documented data quality and traceability for each source interface.
  • Generated weekly and monthly asset inventory reports.
  • Extracted data from the warehouse with the help of TOAD, SQL, and SQL Server.
  • Evaluated data profiling, cleansing, integration, and extraction tools (e.g., Informatica).
  • Coordinated with business users to design new reporting solutions in an appropriate, effective, and efficient way, building on existing functionality.

Environment: Java, Web API, Toad, SQL, PL/SQL, T-SQL, ETL, MS Office, HP QC/ALM, JIRA, Quantum Metrics.

Confidential - Philadelphia, PA

BI Data Analyst

Responsibilities:

  • Worked on CSG ACSR (Advanced Customer Service Representative), a web portal for the CSG billing application through which all billing activities can be performed, such as creating a location, creating a customer, and order-entry functions such as installing, cancelling, or restarting a service.
  • Interacted with the stakeholders to get a better understanding of client business processes and gather business requirements.
  • Identified Use cases from the requirements and wrote Use Case Specifications.
  • Provided support of ad-hoc JIRA requests and special projects as assigned directly or through agile.
  • Interactions with leads and other business users to communicate and clarify the results and business needs.
  • Assisted in development of the specifications by recommending alternative solutions to technical problems.
  • Performed data manipulations, including the use of standard SAS procedures and SAS macros.
  • Worked on bringing large data (i.e., millions of records) from SAS and non-SAS (Teradata and/or Oracle) data sources into a SAS environment and performing SAS data manipulations against the resulting sets.
  • Worked on transforming the resulting SAS datasets into specific text file formats and subsequently manipulating the resulting text files with UNIX.
  • Worked with business partners to identify data reporting, systems, or processing improvements.
  • Tested the above in DEV, QA and UAT region and released in Prod region with the means of IT Service Management (ITSM).
  • Worked on testing, debugging, and supporting documentation (functional and technical) of ad-hoc and special business request for CSG Billing Application.
  • Contributed to continuous improvement and developed self-competencies.
  • Worked toward consistent improvement of development processes.
  • Provided production support for the web portal application and supported the SAS remediation project.
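Exporting datasets to a specific text file format for downstream UNIX processing, as described above, can be sketched with the standard csv module. The field layout and delimiter are illustrative, not taken from the actual project.

```python
# Sketch of writing a dataset to a pipe-delimited text file suitable for
# downstream UNIX tools (cut, awk, sort). Field layout is hypothetical.
import csv
import io

rows = [
    {"account": "1001", "service": "internet", "status": "active"},
    {"account": "1002", "service": "cable", "status": "cancelled"},
]

# An in-memory buffer stands in for a file on disk.
buf = io.StringIO()
writer = csv.DictWriter(
    buf,
    fieldnames=["account", "service", "status"],
    delimiter="|",
    lineterminator="\n",
)
writer.writeheader()
writer.writerows(rows)

text = buf.getvalue()
print(text.splitlines()[1])  # 1001|internet|active
```

A fixed delimiter and explicit line terminator keep the output stable across platforms, which matters when shell scripts parse the file by position.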

Environment: Rational RequisitePro, Rational Clear Quest, SQL Server, Teradata, SAS, Oracle, UML, UNIX, MS Office, MS Project, MS Word, MS Excel, MS Visio
