
Data Analytics Consultant Resume

Des Moines, IA

SUMMARY

  • 8+ years of IT experience in database development, ETL development, data modeling, data visualization, data mining, experimental design & analysis, Business Intelligence, and Business Objects
  • Thorough grounding in all phases of business data analysis, including definition and analysis of questions with respect to available data and resources, overview of data and assessment of data quality, selection of appropriate models and statistical tests, and presentation of results
  • Extensive experience in Text Analytics, developing SAS Macros, Data Mining solutions to various business problems and generating data visualizations using Tableau
  • Proficiency and demonstrated experience in the following technologies:
  • Knowledge of Big Data technologies: SAS, Hadoop, and Spark
  • Sound knowledge of relational databases (RDBMS), the SQL Server stack (SSRS, SSIS, and SSAS), and NoSQL databases
  • Proficiency in Business Intelligence tools (SQL, SAS and Tableau) to regularly examine customer behavior and identify opportunities to increase new customer acquisition and engagement
  • Knowledge of Python, Jupyter, and Pandas
  • Experience in SAP ECC and Teradata technologies
  • Expertise in all aspects of the Software Development Life Cycle (SDLC), from requirement analysis, design, development, and testing to implementation and maintenance, including Agile and Waterfall models
  • Experience in conducting Joint Application Development (JAD) sessions with end users, Subject Matter Experts (SMEs), and Development and QA teams
  • Excellent business writing skills: Business Requirements Documents (BRD), Use Case Specifications, Functional Requirement Specifications (FRS), Software Design Specifications (SDS), Software Requirements Specifications (SRS), Software External Specifications (SES), Software Requirement Documents (SRD), Business Continuity Plans (BCP), workflows, and Unified Modeling Language (UML) diagrams such as activity, class, and sequence diagrams
  • Involved in configuring cron jobs for ETL loads, developing data marts, creating test data, automating system test cases, and analyzing bug-tracking reports
  • Proficient in performing UAT, regression, quality, and backbone testing
  • Extensively used ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution, using tools such as Informatica and DataStage

TECHNICAL SKILLS

  • ETL: SSIS, Informatica
  • Data warehouse: SAP, Teradata
  • Programming languages: Python, Jupyter, R, Pandas
  • Reporting/Statistical: SAS, Tableau, SSRS
  • Database languages: SQL, NoSQL, MongoDB
  • Data models: SSAS, dimensional cubes for OLAP and OLTP, Hadoop, Spark
  • Agile: Rally, HPQC

PROFESSIONAL EXPERIENCE

Data Analytics Consultant

Confidential - Des Moines, IA

Responsibilities:

  • Applied data science methodology end to end: identified the issue, prepared data from different databases using data models and analytics in SAS Enterprise Guide, analyzed the data with Python and SAS, visualized it in Tableau, and presented the resulting data stories
  • Analyzed customer remediation issues to understand and identify customers who were harmed.
  • Critically reviewed remediation analytic approaches to ensure accurate data populations.
  • Profiled data and performed quality checks to identify data outliers, issues, and inaccuracies.
  • Credibly challenged remediation data analytics to ensure reasonable, complete, accurate, and consistent requirements, strategies, data logic, and implementation.
  • Analyzed large datasets using Python scripting and the Pandas library
  • Developed complex SQL queries and SAS macros to query data against different databases for the data-verification process.
  • Developed and reviewed test cases/test scripts to verify data completeness and transformation rules for the ETL tool.
  • Performed SAS/SQL code review for the impacted population using tools like SAS Enterprise Guide, SQL Developer, Teradata SQL, Toad and Microsoft SSMS.
  • Extracted and analyzed data from the Teradata warehouse using Teradata SQL
  • Developed SQL scripts and checked business-requirements coverage, exclusions, and other applicable criteria in the scripts.
  • Performed positive and negative testing and ensured impacted data is accurate.
  • Tested Business Intelligence reports in Business Objects by writing SQL against data marts and comparing the results with the reports.
  • Developed the web services involved, tested the SOA, and automated data comparison and web-service testing using Selenium WebDriver and C# libraries.
  • Analyzed, reviewed, and executed Selenium automation test scripts.
  • Gathered and documented requirements from the business and developed user stories
  • Tested database-related performance before each release and application performance immediately after customer-support fixes.
  • Understood the solution architecture of the application and shared that knowledge with the QA team.
  • Coordinated with the Remediation Analyst to get clarification on supporting documents, remediation requests, data sources, and code-review issues.
  • Followed Confidential's standard business process to review remediation for different Lines of Business (LoBs).
  • Reviewed artifacts such as RMAT, RCAT, Data Request forms, Analytic Approach, Population Waterfall, and Business Requirements documents, and validated the requirements against each of the provided artifacts.
  • Accessed various data sources and analyzed the data.
  • Performed black-box and white-box testing in all phases of the SDLC project
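The profiling and outlier checks described above can be sketched with a small Pandas routine; the column names and the IQR rule here are hypothetical illustrations, not the actual remediation logic.

```python
import pandas as pd

def profile_quality(df: pd.DataFrame, column: str) -> dict:
    """Basic profile of one numeric column: row count, nulls,
    duplicate rows, and IQR-based outliers."""
    s = df[column]
    q1, q3 = s.quantile(0.25), s.quantile(0.75)
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return {
        "rows": len(df),
        "nulls": int(s.isna().sum()),
        "duplicates": int(df.duplicated().sum()),
        "outliers": int(((s < lower) | (s > upper)).sum()),
    }

# Hypothetical remediation population: one refund amount is an obvious outlier
df = pd.DataFrame({"customer_id": [1, 2, 3, 4, 5],
                   "refund_amount": [25.0, 30.0, 27.5, 28.0, 900.0]})
report = profile_quality(df, "refund_amount")
```

A report like this is typically reviewed before the population is accepted, so that outliers and nulls are explained rather than silently loaded.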

Environment: AWS, Teradata SQL, TOAD, SQL, ETL, Data Lake, Python, Jupyter, Pandas, JIRA, GitHub, Hadoop, SAS Enterprise Guide

Sr. Data Analyst

Confidential

Responsibilities:

  • Worked closely with the business and technical project manager to understand the business requirements and translate them into technical specs.
  • Implemented Data integration and Data governance in the AWS environment
  • Analyzed the data using SQL and Python; transformed the data using Informatica PowerCenter.
  • Developed DML and DDL in the Toad environment and integrated them into the main database using Informatica PowerCenter
  • Scheduled and ran Informatica jobs on a Unix server
  • Provided analysis reports and estimates.
  • Designed, developed, installed, tested and maintained data integrations from a variety of formats including files, database extracts and external APIs into data stores (including Snowflake, S3, etc) using ETL tools, techniques and programming languages like Python, Spark, SQL, etc.
  • Extracted relational and non-relational data from Redshift and applied transformations using Informatica
  • Developed the denormalized data models for MongoDB database
  • Performed the loading operations to MongoDB by executing the workflows and mapped the data
  • Created flexible data models and tuned queries and ETL components.
  • Researched possible customization for tuning, cost optimization, performance enhancements, data reliability and quality.
  • Ensured that all solutions met the business/company requirements for data reliability, quality, and disaster recovery.
  • Owned the application/data end to end, from requirements to post-production, working closely with other teams; provided engineering leadership by actively advocating best practices and standards for software engineering.
  • Collaborated with other team members, such as data architects and data scientists.
  • Consistently contributed to project management practices using the Agile method.
  • Presented the prototype to stakeholders and leadership.
  • Built dashboards with data stories in Tableau. Provided input to project timelines, task development, and implementation planning around business requirements and testing needs.
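The denormalized modeling mentioned above can be illustrated with a small sketch: related relational rows are embedded into one document in the shape MongoDB would store. The collection and field names are hypothetical.

```python
def denormalize_order(order: dict, customers: dict, items: list) -> dict:
    """Embed the customer record and line items into a single order
    document, trading duplication for single-read access (a common
    MongoDB denormalization pattern)."""
    return {
        "_id": order["order_id"],
        "placed_at": order["placed_at"],
        # embedded customer snapshot instead of a foreign key
        "customer": customers[order["customer_id"]],
        # embedded array of line items instead of a join table
        "items": [i for i in items if i["order_id"] == order["order_id"]],
    }

# Hypothetical relational source rows
customers = {7: {"name": "Acme Corp", "region": "IA"}}
order = {"order_id": 101, "customer_id": 7, "placed_at": "2019-03-01"}
items = [{"order_id": 101, "sku": "A1", "qty": 2},
         {"order_id": 102, "sku": "B2", "qty": 1}]

doc = denormalize_order(order, customers, items)
```

The resulting document can then be loaded into MongoDB in one write and read back without joins, which is the usual rationale for this model.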

Environment: AWS, Informatica, TOAD, SQL, ETL, Data Lake, MongoDB, Python, JIRA, GitHub, Hadoop

Data Analyst

Confidential

Responsibilities:

  • Implemented Confidential's payroll and related software applications, including SAP and Cyborg.
  • Analyzed big data using SQL and Python.
  • Built dashboards with data stories in Tableau. Provided input to project timelines, task development, and implementation planning around business requirements and testing needs.
  • Integrated and analyzed time data from SAP HANA and the AUTO TA, CATTS, and Workbrain timekeeping systems into SAP HCM.
  • Worked on complex big data sets with focus on collecting, parsing, managing, analyzing and visualizing large data sets to turn information into insights using multiple platforms
  • Responsible for developing and testing SQL databases and writing applications to interface with SQL databases, as well as writing and testing code.
  • Performed ETL (Extract, Transform, and Load) operations of data into a data warehouse/data mart and Business Intelligence (BI) tools
  • Worked in System Analysis, E-R/Dimensional Data Modeling, Database Design and implementing RDBMS specific features
  • Interacted with the ETL team, developers, management, and account holders to gather requirements, document them, design templates, and write specifications for data mapping
  • Architected Extract, Transform, and Load (ETL) processes to populate an operational data store from various sources, including other databases such as Teradata and SQL, spreadsheets, and flat files.
  • Implemented Agile methodology for building an internal application and produced ad hoc and standard reports.
  • Configured new datasets and tested new change requests for Union payroll, benefits and timekeeping projects to integrate the changes into SAP business system.
  • Extracted different types of file formats from JD Edwards travel database and loaded the data to SAP Cloud database with the help of ETL validator
  • Designed APIs to load the data and map the data to HANA cloud platform from the relational data base system and monitored the replication activities
  • Performed data quality analysis on the JD Edwards database and extracted data to staging area of ETL tool to clean and aggregate the data as per the business requirements
  • Developed and executed test scripts for new and existing functionality
  • Took regular database backups and ensured issues were identified, tracked, reported, and resolved or escalated in a timely manner
  • Effectively applied company methodology and enforced project standards
  • Created mapping documents to map source fields to target fields and maintained field transformations.
  • Collaborated with the data science team on implementing machine learning algorithms to facilitate audience
  • Worked with technical team to improve report performance and performed system testing.
  • Performed tasks such as writing test cases, identifying defects, creating mockups, quality reporting, data cleansing, and data quality checks.
  • Performed functional testing, integration testing, automation testing, and User Acceptance Testing.
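The mapping documents described above can be sketched as a source-to-target dictionary plus a transform step; the field names and rules below are hypothetical illustrations.

```python
# Each target field maps to a source field plus a transformation,
# mirroring one row of a source-to-target mapping document.
# All names here are hypothetical examples.
MAPPING = {
    "employee_id": ("EMP_NO", str),
    "full_name":   ("NAME", lambda v: v.strip().title()),
    "hourly_rate": ("RATE", float),
}

def apply_mapping(source_row: dict) -> dict:
    """Apply the mapping document to one extracted source row."""
    return {target: transform(source_row[src])
            for target, (src, transform) in MAPPING.items()}

# One extracted payroll row, with the raw formatting issues the
# transforms are meant to clean up
row = apply_mapping({"EMP_NO": 4021, "NAME": "  jane DOE ", "RATE": "18.50"})
```

Keeping the mapping as data rather than code makes it easy to diff against the mapping document during testing.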

Environment: SAP, Oracle, SQL, ETL, Data Lake, Python, Rally, SharePoint, Hadoop, SQL Server, Teradata

Data Analyst

Confidential

Responsibilities:

  • Designed and implemented business-logic ETL processes using SQL Server Integration Services (SSIS) 2008 R2/2014 that extracted, transformed, and validated data written to an operational data store (ODS) and data marts
  • Analyzed various sources of data and provided recommendations to management; also designed and supported table structures for new data imports
  • Scheduled jobs and monitored automated ETL processes
  • Configured SSIS packages to use environment variables
  • Managed, monitored, and debugged ETL processes in SSIS, Informatica, and Dell Boomi AtomSphere
  • Migrated ETL processes from Dell Boomi AtomSphere to SSIS
  • Performed SQL Server database administration, including backup and restore and server and database audits
  • Employed primary, foreign, and unique keys, defaults, and check constraints in creating tables
  • Migrated on-premises SQL Server to SQL Server on Azure Virtual Machines (IaaS)
  • Designed, developed, tested, and maintained Tableau functional reports based on user requirements.
  • Designed and deployed rich graphical visualizations in Tableau and converted existing Business Objects reports into Tableau dashboards
  • Wrote SQL with arrays and string formatting to structure data with dynamic nested headers
  • Worked closely with business users; performed exploratory data analysis and data visualizations using R and Tableau.
  • Interacted with ETL developers, project managers, and members of the QA teams.
  • Worked through the complete lifecycle of application development and data warehouse/BI implementation, from corporate data and reporting requirements gathering and source system and data analysis through design and development of ETL processes, the stage database, the dimensional data warehouse, reporting, and OLAP.
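The key and check constraints mentioned above can be illustrated with an in-memory SQLite sketch; the table names are hypothetical, and the same DDL concepts (PRIMARY KEY, UNIQUE, REFERENCES, DEFAULT, CHECK) carry over to SQL Server.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,
        email       TEXT UNIQUE NOT NULL
    )""")
conn.execute("""
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES dim_customer(customer_id),
        amount      REAL NOT NULL DEFAULT 0 CHECK (amount >= 0)
    )""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'a@example.com')")
conn.execute("INSERT INTO fact_sales (sale_id, customer_id, amount) "
             "VALUES (10, 1, 99.5)")

# A row pointing at a nonexistent customer is rejected by the foreign key
try:
    conn.execute("INSERT INTO fact_sales (sale_id, customer_id) VALUES (11, 999)")
    fk_rejected = False
except sqlite3.IntegrityError:
    fk_rejected = True
```

Defining these constraints in the DDL pushes referential and range validation into the database itself instead of relying on ETL-side checks alone.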

Environment: SQL, ETL, HPQC, SharePoint, Boomi, ERWIN

Business Analyst

Confidential

Responsibilities:

  • Served as liaison between business users and the technical team during analysis and design.
  • Identified opportunities for improving business processes through information systems.
  • Conducted JAD sessions with stakeholders, end users, SMEs, and QA analysts to determine the critical business processes and developed key functional and technical requirements.
  • Used SQL queries (select, insert, update, delete, subqueries, joins, etc.) to manipulate the data in the database and generate reports to help the business users for their daily activities.
  • Worked on a Data mining project to create a remote central data warehouse for data from multiple applications to use for reporting purposes.
  • Used Excel extensively: processing large quantities of data, building pivot tables, applying conditional formatting, and applying business rules.
  • Planned, organized, and conducted walkthroughs with business units and the technical team to build a common understanding of requirements across teams and obtain sign-off.
  • Maintained all the stories and tasks in Jira which was used to know the status of requirements.
  • Implemented SAFe methodologies in JIRA for the project.
  • Worked in Waterfall and ASAP methodologies of software development
  • Attended meetings with the client development and support managers
  • Accessed the production-support ticket tool and assigned tickets to the ABAP hot seat and offshore teams
  • Worked with data governance, data quality, data lineage, and data architects to design various models and processes.
  • Implemented end-to-end systems for Data Analytics, Data Automation and integrated with custom visualization tools using Informatica, Tableau and business objects
  • Designed, developed, tested, and maintained Tableau functional reports based on user requirements.
  • Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
  • Developed Informatica mappings using Aggregator transformations, SQL overrides in Lookups, source filters in Source Qualifiers, and data-flow management into multiple targets via Router transformations.
  • Created sessions, extracted data from various sources, transformed it according to the requirements, and loaded it into the data warehouse.
  • Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
  • Developed several reusable transformations that were used in other mappings.
  • Prepared Technical Design documents and Test cases
  • Involved in unit testing and resolution of various bottlenecks encountered
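The Aggregator and Router transformation pattern described above can be approximated in plain Python; the group keys and routing predicate below are hypothetical, and a real Informatica mapping would express them declaratively rather than in code.

```python
from collections import defaultdict

def aggregate(rows, key, value):
    """Aggregator-style transformation: sum `value` grouped by `key`."""
    totals = defaultdict(float)
    for r in rows:
        totals[r[key]] += r[value]
    return dict(totals)

def route(rows, predicate):
    """Router-style transformation: split rows between a matched
    target group and a default group."""
    matched, defaulted = [], []
    for r in rows:
        (matched if predicate(r) else defaulted).append(r)
    return matched, defaulted

# Hypothetical source rows flowing through the mapping
rows = [{"region": "East", "sales": 100.0},
        {"region": "West", "sales": 250.0},
        {"region": "East", "sales": 50.0}]

totals = aggregate(rows, "region", "sales")       # {'East': 150.0, 'West': 250.0}
big, small = route(rows, lambda r: r["sales"] >= 100)
```

Here `totals` plays the role of the Aggregator output feeding a summary target, while `big` and `small` correspond to two Router output groups loaded into separate targets.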

Environment: SQL, SAP, Informatica
