Consultant (Analytics) Resume

Atlanta, GA

SUMMARY

  • 7+ years of experience as a Data Analyst, proficient in the design, development, testing, implementation, and support of enterprise data warehouses.
  • Experience facilitating Joint Application Design (JAD) sessions with internal users and stakeholders to elicit requirements, and documenting scope definitions, functional specifications, Functional Requirements Documents (FRDs), use cases, and UML diagrams.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Governance, Data Integration and Metadata Management Services.
  • Experience in all phases of software development life cycle (SDLC), including Requirement gathering and documentation, Use Case Documentation, FRD Specification, Business Case Study & Analysis, Quality Assurance & Testing.
  • Experience with standard software development project management process (Agile) with analysis and documentation techniques (use cases, business rules, process diagrams, etc.).
  • Expert in Microsoft Excel (Pivot Tables, VLOOKUP, HLOOKUP, formulas, charts, VBA, etc.)
  • Expertise in ETL tools such as SSIS and Clover ETL, and reporting tools such as SSRS and Tableau.
  • Good hands-on working experience with all kinds of SQL Server constraints (primary key, foreign key, default, check, and unique), writing Transact-SQL (T-SQL) queries, and dynamic queries; see the T-SQL sketch after this list.
  • Good experience in Risk analysis and Risk management.
  • Experience in process improvement and re-engineering, with an understanding of technical problems and solutions as they relate to the current and future business environment.
  • Created database objects such as tables, views, sequences, synonyms, and indexes using tools such as SQL*Plus, SQL Developer, SSMS, and Toad.
  • Strong knowledge of and hands-on experience with AWS and Redshift.
  • Solid experience in creating cloud-based solutions and architectures using Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS) and Microsoft Azure.
  • Experience in data cleaning, manipulation, and querying using SQL (MySQL, PostgreSQL, Redshift).
  • Support and update the functional master data management policies, procedures & processes.
  • Proficient in design and development of various dashboards and reports utilizing Tableau visualizations such as Dual Axis, Bar Graphs, Scatter Plots, Pie Charts, Heat Maps, Bubble Charts, Tree Maps, Funnel Charts, Box Plots, Waterfall Charts, and Geographic Visualizations, making use of actions and other local and global filters according to end-user requirements.
  • Experienced in developing Web Services with Python programming language.
  • Expertise in Creating Unified Modeling Language (UML) diagrams such as Business Process Flow diagrams, Use Case diagrams, Activity diagrams and Component diagrams, using Erwin and Microsoft Visio.
  • Responsible for tracking, documenting, capturing, managing, and communicating requirements using a Requirements Traceability Matrix (RTM), which helped in controlling the numerous artifacts produced by the teams across the deliverables for a project.
  • Proficient in capturing Data Requirements, Data Analysis & Data Mapping for Vendor feeds and Databases.
  • Experience in retrieving data in database tables by SQL Queries for database verification.
  • Involved in creating and maintaining Requirements Traceability Matrix and performing GAP analysis
  • Conducted User Acceptance Testing (UAT) and performed early analysis of existing systems to help define integrations.
  • Experience with requirements management and test management tools such as HP ALM (Application Lifecycle Management) and JIRA.
  • Experience in Software Configuration Management / Version Control
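
The following is a minimal T-SQL sketch of the kind of constraint definition and join/aggregation querying described in the list above; the table and column names (dim_customer, fact_order) are hypothetical and for illustration only.

    -- Hypothetical dimension table with primary key, default, and unique constraints
    CREATE TABLE dim_customer (
        customer_id   INT           NOT NULL PRIMARY KEY,
        customer_name VARCHAR(100)  NOT NULL,
        region        VARCHAR(50)   NOT NULL DEFAULT 'UNKNOWN',
        email         VARCHAR(255)  NULL UNIQUE
    );

    -- Hypothetical fact table with foreign key and check constraints
    CREATE TABLE fact_order (
        order_id     INT           NOT NULL PRIMARY KEY,
        customer_id  INT           NOT NULL REFERENCES dim_customer (customer_id),
        order_amount DECIMAL(12,2) NOT NULL CHECK (order_amount >= 0),
        order_date   DATE          NOT NULL
    );

    -- Aggregation with a join, grouping, and a HAVING filter
    SELECT c.region,
           COUNT(*)            AS order_count,
           SUM(o.order_amount) AS total_amount
    FROM   fact_order o
    JOIN   dim_customer c ON c.customer_id = o.customer_id
    GROUP  BY c.region
    HAVING SUM(o.order_amount) > 10000;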

TECHNICAL SKILLS

Requirement Management Tools: Rational Requisite Pro, HP Quality Center/ ALM, JIRA, SharePoint, Doors

Business Modeling Tools: Rational Rose, MS-Visio

Change Management Tools: Rational Clear Quest, Rational Clear Case

SDLC: Waterfall, Agile-Scrum, RUP Methodology, Spiral Method, Extreme programming

Languages: HTML, SQL, C, C++, Java, JavaScript, Python

RDBMS: MS SQL Server, Teradata, Redshift, MS Access, IBM DB2

Business Analysis Skills: JAD Sessions, GAP Analysis, Use Case Modeling, Requirement Gathering, Business Process Analysis & Design

Other Tools: SAS Enterprise Guide, SAP ECC and Panorama

Other Software: MS Office Suite (Word/Excel/PowerPoint), HP ALM, MS Project, TOAD

Tools: SQL, UML, SharePoint, HP Quality Center, TOAD, Oracle, SSMS, SSIS

BI Tools: Tableau, OBIEE, QlikView, Amazon Redshift, Azure Data Warehouse

MS Office Tools: MS Excel, MS Access, MS PowerPoint 2003/07

PROFESSIONAL EXPERIENCE

Confidential - Atlanta, GA

Consultant (Analytics)

Responsibilities:

  • Developed inventory policies and established accountability for tactical and strategic results through effective reporting, monitoring and cross-functional engagement.
  • Work directly with clients to analyze their data.
  • Created data specification for client for data validation.
  • Designed and led optimization projects and operations including information, data and records architecture, and data governance.
  • Wrote complex SQL queries using joins, grouping, aggregation, nested subqueries, etc.
  • Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements.
  • Developed MapReduce/Spark Python modules for machine learning and predictive analytics in Hadoop on AWS. Implemented a Python-based distributed random forest via Hadoop Streaming.
  • Managed storage in AWS using S3, created volumes and configured snapshots.
  • Maintained version control of SQL, Python using GitHub
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Analyzed business processes using Python and SQL for monitoring regulatory compliance.
  • Validated previously developed Python reports, fixed the identified bugs, and redeployed them.
  • Make day-to-day decisions for the overall landscape of master data objects and the application of business rules.
  • Ensure master data integrity in key systems and maintain the processes that support data quality.
  • Work closely with business/IT to ensure alignment of master data rules, and design the process for data collection and the creation of a central repository for data integrity.
  • Transformed staging-area data into a star schema (hosted on Amazon Redshift), which was then used for developing embedded Tableau dashboards; a SQL sketch follows this list.
  • Worked on AWS Cloud management and responsible for code build, release and configuration on Amazon EC2.
  • Created EC2 instance to deploy the images to AWS Cloud
  • Acquire data from primary or secondary data sources like Amazon S3, RedShift, FTP, and external or internal files; extract and analyze data to generate reports.
  • Used MS Excel extensively (pivot tables, formulas, and lookups) to analyze the data.
  • Used macros in MS Excel to automate a few regular process flows.
  • Gathered requirements for ETL developers and document data specific processes.
  • Customized and generated SQL Server databases and reports in SSMS for the evolving data analysis process.
  • Interacted with users for verifying User Requirements, managing Change Control Process, updating existing Documentation
  • Provide Master Data Management (MDM), process leadership globally to ensure all regions are following the global process and achieving data quality metrics through regular meetings.
  • Developed Tableau visualizations and dashboards using Tableau Desktop for Clients to review their data.
  • Analyzed requirements, wrote user stories, and managed the Agile lifecycle in Team Foundation Server (TFS) and JIRA from planning to sprint deployment. Tracked and communicated team velocity and sprint/release progress. Assisted in managing the Product Backlog and Sprint Backlog.
  • Identified the current business rules and determined where improvements were required.
  • Fulfill responsibilities of a project manager to conduct meetings, track and report status updates for multiple components.
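
Below is a minimal sketch, in Redshift-compatible SQL, of the staging-to-star-schema transformation mentioned above; the schemas, tables, and columns (staging.sales_raw, dw.dim_product, dw.fact_sales) are hypothetical.

    -- Load a product dimension from the staging area (one row per product code)
    INSERT INTO dw.dim_product (product_key, product_code, product_name, category)
    SELECT ROW_NUMBER() OVER (ORDER BY product_code) AS product_key,
           product_code,
           MAX(product_name),
           MAX(category)
    FROM   staging.sales_raw
    GROUP  BY product_code;

    -- Load the fact table, resolving surrogate keys through a join to the dimension
    INSERT INTO dw.fact_sales (product_key, sale_date, quantity, net_amount)
    SELECT p.product_key,
           s.sale_date,
           s.quantity,
           s.quantity * s.unit_price AS net_amount
    FROM   staging.sales_raw s
    JOIN   dw.dim_product    p ON p.product_code = s.product_code;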

Environment: MS Visio, HP ALM, Data Analysis, Redshift, MS Excel, AWS, MS Word, Teradata, Clover ETL, SSMS, SSIS, Oracle Database, MS Project, Tableau, SQL, SharePoint, Python, MS SQL Server, SAP ECC 6.0, Agile/Scrum, Visual Studio, TFS, HTML, MS Office

Confidential, Dallas, TX

Data Analyst

Responsibilities:

  • Cross-referenced new applicant information with existing card member information to detect and prevent possible fraud.
  • Responsible for timeliness and accuracy of daily billing operations activities, audits and validations.
  • Developed, documented, and reported data quality goals and standards
  • Wrote PL/SQL statements and stored procedures in Oracle for extracting as well as writing data.
  • Created Rich dashboards using Tableau Dashboard and prepared user stories to create compelling dashboards to deliver actionable insights.
  • Performed back-end testing by writing SQL statements with inner joins, outer joins, and self joins, using TOAD and SSMS; see the sketch after this list.
  • Facilitated regular Joint Application Design meetings with key application and project stakeholders including program managers and divisional coordinators for multi-tiered, high-risk, multi-cadence releases
  • Triaged Production and Pre-release defects in up to 6 parallel environments utilizing versioning and tracking tools such as SharePoint and Test Director
  • Communicated system and business processes to project stakeholders via Activity, Process and Data Flow diagramming using SQL, Visio, SnagIt, FrontPage, etc.
  • Prepare daily/weekly/monthly Disaster Recovery tape backups for offsite vault storage.
  • Sort and file documents in a file vault using a numerical filing system.
  • Performed Scope, Risk and Gap Analysis using system reports, MS Access and MS Project used as input to management program decisions.
  • Performed AWS Cloud administration managing EC2 instances, S3, SES and SNS services
  • Used AWS S3, DynamoDB, AWS Lambda, and AWS EC2 for data storage and model deployment.
  • Moved data across Teradata, SQL Server, and Redshift databases.
  • Master Data Migration and Management: Master data management during a deployment process, including data design, data collection, data loads and data migration. Conducted Knowledge Transfer of Global template to facilitate data collection.
  • Maintain fraud analysis models to improve efficiency and effectiveness of company systems.
  • Communicated requirements through visual diagrams, including data flow diagrams and system functional diagrams, to both management and development teams to ensure a uniform understanding of the business scope and of the detailed business and functional requirements; collaborated with the development team to enforce the implementation of requirements throughout the coding cycle and managed change requests.
  • Involved in all phases of the SDLC, from initiation to deployment phase; performed change management to update the BRD/ SRS when necessary.
  • Developed monitoring and notification tools using Python.
  • Create rules for Data Cleaning and transformation, find data anomalies and remediate data issues.
  • Coordinate Master Data upload for Test Cycle periods, Simulation Test as well as the main productive data and collection of transactional data during cutover period.
  • Acted as a change management agent for master data related issues, in line with set SLAs and workflows.
  • Production support, troubleshooting and resolution.
  • Maintained and modified routine complex reports using SAP ECC/CRM/Web Intelligence/Business Objects/Finance, IBM DB2, Microsoft Access systems and databases.
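
A minimal sketch of the kind of back-end validation queries described above, as run from TOAD or SSMS; the source and target table names (src_customer, tgt_customer) are hypothetical.

    -- Outer join: rows present in the source but missing from the target
    SELECT s.customer_id
    FROM   src_customer s
    LEFT OUTER JOIN tgt_customer t ON t.customer_id = s.customer_id
    WHERE  t.customer_id IS NULL;

    -- Self join: duplicate email addresses within the target table
    SELECT a.customer_id, b.customer_id, a.email
    FROM   tgt_customer a
    JOIN   tgt_customer b ON a.email = b.email
                         AND a.customer_id < b.customer_id;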

Environment: HP ALM, SharePoint, MS SQL Server, Python, Redshift, MS Visio, Tableau, Teradata, Toad, MS Project, JIRA, MS Excel, SQL, UML Diagrams and Flow Charts, MS Access, SAP ECC, AWS, Agile/Scrum, Visual Studio, TFS, HTML, Oracle, MS Office

Confidential

Data Analyst

Responsibilities:

  • Work with users to identify the most appropriate source of record and profile the data required for sales and service.
  • Document the complete process flow to describe program development, logic, testing, implementation, application integration, and coding.
  • Involved in defining the business/transformation rules applied for sales and service data.
  • Define the list codes and code conversions between the source systems and the data mart; see the sketch after this list.
  • Worked with internal architects, assisting in the development of current- and target-state data architectures.
  • Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines
  • Involved in defining the source to target data mappings, business rules, business and data definitions
  • Responsible for defining the key identifiers for each mapping/interface
  • Responsible for defining the functional requirement documents for each source to target interface.
  • Document, clarify, and communicate requests for change requests with the requestor and coordinate with the development and testing team.
  • Involved in configuration management in the process of creating and maintaining an up-to-date record of all the components of the development efforts in coding and designing schemas
  • Developed the financing reporting requirements by analyzing the existing business objects reports
  • Interact with computer systems end-users and project business sponsors to determine, document, and obtain signoff on business requirements.
  • Utilized Informatica toolset (Informatica Data Explorer, and Informatica Data Quality) to analyze legacy data for data profiling.
  • Responsible in maintaining the Enterprise Metadata Library with any changes or updates
  • Document data quality and traceability documents for each source interface
  • Establish standards of procedures.
  • Generate weekly and monthly asset inventory reports.
  • Evaluated data profiling, cleansing, integration, and extraction tools (e.g., Informatica).
  • Coordinate with business users to provide an appropriate, effective, and efficient way to design for new reporting needs, based on user requirements and the existing functionality.
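
A minimal SQL sketch of a source-to-data-mart code conversion of the kind described above; the cross-reference table and code values (xref_status_code, 'LEGACY_CRM') are hypothetical.

    -- Cross-reference table mapping legacy source-system codes to data-mart codes
    CREATE TABLE xref_status_code (
        source_system VARCHAR(20) NOT NULL,
        legacy_status VARCHAR(10) NOT NULL,
        mart_status   VARCHAR(10) NOT NULL,
        PRIMARY KEY (source_system, legacy_status)
    );

    -- Apply the conversion during the data mart load; unmapped codes fall back to 'UNK'
    SELECT s.account_id,
           COALESCE(x.mart_status, 'UNK') AS status_code
    FROM   src_account s
    LEFT JOIN xref_status_code x
           ON x.source_system = 'LEGACY_CRM'
          AND x.legacy_status = s.status;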

Environment: MS Visio, UML, Quality Center, MS Excel, SQL, MS Project, MS Office, HP ALM, Metadata, Reports
