Data Analyst/ Data Engineer Resume

Richmond, VA

SUMMARY:

  • 7 years of extensive experience as a Data Analyst/Data Engineer/System Analyst/SQL Developer/Tableau Reporting Analyst, with solid experience evaluating data sources and a strong understanding of Data Warehouse, AWS Redshift, Big Data, Mainframes, ETL methodologies, Tableau 9.x, BI, OLAP, and Tableau Server.
  • Prior experience with Confidential in McLean and Richmond, VA.
  • Wrote Python scripts to load, unload, manipulate, process, and analyze data from diverse sources.
  • Experience writing data quality checks using Python and PySpark.
  • Experience creating AWS EC2 Linux instances using CloudFormation templates and storing flat files on them.
  • Installed Python and several Python modules to analyze data on AWS EC2 Instances.
  • Involved in data migration processes to migrate historical data from a legacy system into AWS Redshift (to S3 as flat files and Redshift as tables) using Python.
  • Experience using JIRA and Agile/Scrum methodology; well versed in writing user requirements.
  • Strong knowledge of and experience in the entire Software Development Life Cycle (SDLC).
  • Strong working experience in Data Analysis, Data Validation, Data Verification and identifying data mismatch.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Hands-on experience writing Python scripts for data extract and data transfer.
  • Experience in developing reports using Power BI, SSRS and Tableau.
  • Strong experience writing Python programs to read data from various Teradata tables and convert it to CSV files (see the sketch after this list).
  • Excellent knowledge in preparing required project documentation, tracking, and reporting regularly on the status of projects to all project stakeholders.
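
A minimal sketch of the Teradata-to-CSV extraction pattern noted above, assuming the teradatasql DB-API driver; the host, credentials, and table names are placeholders:

    # Hypothetical sketch: dump one Teradata table to a CSV file.
    import csv
    import teradatasql

    with teradatasql.connect(host="tdhost", user="user", password="***") as con:
        with con.cursor() as cur:
            cur.execute("SELECT * FROM sandbox_db.accounts")
            with open("accounts.csv", "w", newline="") as fh:
                writer = csv.writer(fh)
                writer.writerow([col[0] for col in cur.description])  # header row
                writer.writerows(cur.fetchall())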

TECHNICAL SKILLS:

Environment/Platforms: Unix, Windows XP, Mainframes

Languages: SQL, UNIX Shell Scripting, Batch Scripting, Python, SAS 8.x/9.x

Databases: Teradata, SQL Server, Snowflake, Oracle 10g/9i/8i/7.x, MS Access, MySQL

Statistical Packages: SAS 8.x/9.x (Base SAS, SAS/STAT, SAS/SQL, SAS/MACROS, SAS/GRAPH, SAS/ACCESS, SAS/ODS), SAS Enterprise Guide, SAS Information Map Studio

Tools: SQL Assistant, BTEQ, Snowflake, AWS, Redshift

BI Tools: Tableau Desktop, Tableau Server, Power BI.

GUI: MS Office Suite

PROFESSIONAL EXPERIENCE:

Confidential, Richmond, VA

Data Analyst/ Data Engineer

Responsibilities:

  • Responsible for the design, development, and administration of the CML Confidential Data Quality Initiative.
  • Wrote data quality checks, based on the existing source code, using Python and PySpark DataFrames on the Databricks platform, which improved processing time (see the sketch after this list).
  • Populated data quality failures to Snowflake tables to support data quality assertions for regulatory reporting and modeling activities.
  • Responsible for matching the results with UAT test results.
  • Involved in data migration processes to migrate data from CDA into One Lake (to Snowflake as tables) using Python.
  • Developed, maintained, and improved over 200 DQ checks, which were scheduled and run on a daily basis.
  • Performed data analysis by retrieving data from Snowflake tables with SQL and exporting the results to Excel.
  • Ensured the end-to-end process worked as expected by analyzing results at every step via Python, SQL, and Excel.
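
A minimal sketch of one such data quality check, assuming a Databricks notebook with the Snowflake Spark connector available; the table, rule, and connection options are illustrative, not the actual CML code:

    # One null-check rule: capture failing rows and append them to a
    # Snowflake table for downstream reporting. All names are placeholders.
    from pyspark.sql import functions as F

    df = spark.table("cml.loan_positions")   # `spark` is predefined in Databricks
    failures = df.filter(F.col("account_id").isNull())

    sf_options = {                           # placeholder credentials
        "sfURL": "account.snowflakecomputing.com",
        "sfUser": "dq_user", "sfPassword": "***",
        "sfDatabase": "DQ", "sfSchema": "PUBLIC", "sfWarehouse": "DQ_WH",
    }

    if failures.limit(1).count() > 0:
        (failures.withColumn("dq_rule", F.lit("account_id_not_null"))
                 .write.format("snowflake").options(**sf_options)
                 .option("dbtable", "DQ_FAILURES").mode("append").save())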

Environment: Python, PySpark, Snowflake SQL, Databricks, Microsoft Excel, MS Office.

Confidential, San Antonio, TX

Data Analyst

Responsibilities:

  • Responsible for supporting the Sensitive Data Management team in daily ongoing activities.
  • Responsible for gap analysis and L2 validations on different databases.
  • Provided support for the team in L2 validations and made sure the databases were PCI compliant.
  • Created reports with pivot charts, linear regression charts, and Pareto analysis charts.
  • Created Power BI reports to track and communicate weekly remediation status.
  • Extracted and loaded data using Python (see the sketch after this list).
  • Created gap analysis templates for various databases and created reports for the client.
  • Updated the weekly project status report in the RTC application.
  • Participated in daily standup meetings with the client.
  • Provided support for the team in PCI scanning and remediation for different groups of file owners.
  • Created reports for the groups of file owners and made sure they were successfully remediating files per compliance requirements.
  • Extracted the required information from data sources using SQL and Unix scripting.
  • Responsible for the design, development, and maintenance of ongoing analyses, reports, and dashboards for key business decisions.
  • Identified, analyzed, and interpreted trends and patterns in complex datasets.
  • Helped the team with daily ongoing issues per business needs and requirements.
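
An illustrative sketch of the Python extract-and-report step, assuming pyodbc for SQL Server access and openpyxl for the Excel export; the server, database, and table names are invented:

    # Pull L2 validation results and export a weekly remediation workbook.
    import pandas as pd
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sdm-sql;DATABASE=Remediation;Trusted_Connection=yes;"
    )
    df = pd.read_sql("SELECT owner, db_name, status FROM dbo.l2_validation", conn)

    summary = df.groupby("status", as_index=False).size()  # rows per status
    with pd.ExcelWriter("weekly_remediation.xlsx") as xl:   # needs openpyxl
        df.to_excel(xl, sheet_name="detail", index=False)
        summary.to_excel(xl, sheet_name="summary", index=False)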

Environment: Power BI, Tableau, SQL Server, Python, MS Access, Netezza, SharePoint, Unix Shell scripting, Putty, SAS, Microsoft Visio, Microsoft Excel, MS Office.

Confidential, Chicago

Data Analyst

Responsibilities:

  • Supported business users and technical teams on the ongoing ESP portfolio project.
  • Analyzed BARS data for the BARS billing project.
  • Mapped data from source to destination systems (see the sketch after this list).
  • Used JIRA for issue tracking, defect resolution, and project traceability.
  • Documented the SQL scripts, presented them in Visio diagrams, and produced reports.
  • Gained good exposure to the enterprise data warehouse (EDW), the big data lake domain, and SQL.
  • Performed source system analysis within the healthcare service corporation (Confidential).
  • Documented data source layouts, including tables, files, and structures.
  • Provided support for the technical teams in analyzing the data.
  • Responsible for gathering the detailed business requirements based on the project needs.
  • Involved in reviewing the business requirements and analyzing the data from source to target.
  • Worked on customer billing information of BARS and extracted data from various sources.
  • Created Tableau dashboards/reports for the business users.
  • Involved with end users to gain an understanding of Information and core data concepts behind their business.
  • Fulfilled ad-hoc requests according to user specifications using data management software and tools such as Perl, MS Access, Excel, and SQL.
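
A hedged sketch of how such source-to-destination mappings can be spot-checked with row-count reconciliation; the connections and table pairs are hypothetical:

    # Compare row counts for each (source, target) table pair after a load.
    def row_count(conn, table):
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")  # trusted table names only
        return cur.fetchone()[0]

    def reconcile(src_conn, tgt_conn, mappings):
        for src, tgt in mappings:
            src_n, tgt_n = row_count(src_conn, src), row_count(tgt_conn, tgt)
            status = "OK" if src_n == tgt_n else "MISMATCH"
            print(f"{src} -> {tgt}: {src_n} vs {tgt_n} [{status}]")

    # e.g. reconcile(teradata_conn, redshift_conn, [("bars.billing", "edw.billing")])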

Environment: Teradata SQL Assistant, Big data lake, AWS SCT, Python, AWS Redshift, Tableau desktop, DB2 mainframes, SharePoint, Unix Shell scripting, Putty, GitHub, Microsoft Visio, Microsoft Excel, JIRA and MS Office

Confidential, McLean, VA

Data Analyst

Responsibilities:

  • Responsible for the design, development, and administration of Branch Operations Compliance and Audit reporting.
  • Updated the branch data change log per new requests on a weekly/monthly basis.
  • Responsible for the monthly updates to the branch RD DEPT DETAIL, Branch Inventory Listing, BRCH, and Branch Locator files.
  • Handled branch opening/closing changes, which had to be made in all four data sources.
  • Loaded the weekly data into the local repository SAMGW by executing UNIX commands.
  • Created Tableau dashboards for DA intake requests.
  • Wrote Python scripts to load, unload, manipulate, process, and analyze data from diverse sources.
  • Created AWS EC2 Linux instances using CloudFormation templates and stored flat files on them.
  • Installed Python modules to analyze data on AWS EC2 Instances.
  • Involved in data migration processes to migrate historical data from a legacy system into AWS Redshift (to S3 as flat files and Redshift as tables) using Python.
  • Refreshed Tableau extracts and added/modified existing reports per new requests.
  • Developed Teradata SQL queries and used Utilities such as BTEQ.
  • Hands-on experience with MapReduce jobs and with installing, configuring, and administering Hadoop clusters from major Hadoop distributions.
  • Performed data analysis by using Hive to retrieve data from the Hadoop cluster and SQL to retrieve data from the database.
  • Involved in developing Hive DDLs to create, alter, and drop Hive tables (see the sketch after this list).
  • Knowledge of data warehousing using Informatica PowerCenter 9.x/8.x/7.x in OLAP and OLTP environments.
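
A short sketch of the Hive DDL work, assuming a Hive-enabled SparkSession; the database, table, and column names are illustrative:

    # Create, alter, and drop Hive tables from PySpark.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder.appName("branch-hive-ddl")
             .enableHiveSupport().getOrCreate())

    spark.sql("""
        CREATE TABLE IF NOT EXISTS ops.branch_inventory (
            branch_id  STRING,
            region     STRING,
            updated_dt DATE)
        STORED AS PARQUET
    """)
    spark.sql("ALTER TABLE ops.branch_inventory ADD COLUMNS (status STRING)")
    spark.sql("DROP TABLE IF EXISTS ops.branch_inventory_stage")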

Environment: Tableau (Desktop/Server), Teradata SQL Assistant, AWS Redshift, Python, GitHub, MS Office, Agile, Windows, UNIX, BTEQ, SAS Desktop, SAS EG, Putty, ETL, Informatica PowerCenter 9.x/8.x/7.x, HDFS, Hive, and NoSQL.

Confidential, Tampa, FL

Data Specialist

Responsibilities:

  • Worked with business analysts, senior project managers, and programmers to gather business requirements and specifications.
  • Extracted up-to-date account data from the Teradata database using BTEQ.
  • Developed interfaces in SQL for data calculations and data manipulations.
  • Created PowerPoint presentations (financial data, charts).
  • Used MS Excel and Teradata for data pulls and ad-hoc reports for business analysis.
  • Performed verification, validation, and transformations on the input data (text files) before loading it into the target database (see the sketch after this list).
  • Performed in-depth data analysis and prepared weekly, biweekly, and monthly reports using Teradata, MS Excel, and UNIX.
  • Defined parameters for data integrity and compliance by performing data cleansing, data audits, and data validation.
  • Created SAS reports in external locations using DATA _NULL_ steps with FILE and PUT statements.
  • Developed SAS programs for converting large volumes of text files into Teradata tables by importing the text files from Mainframes to the desktop.
  • Designed scripts in SAS to be compatible with Teradata, to load and access data from the Teradata tables.
  • Generated reports using Base SAS procedures: PROC PRINT, PROC SORT, PROC TABULATE, PROC FREQ, PROC MEANS, and PROC SUMMARY.
  • Provided input for data governance requirements and standards.
  • Planned and defined criteria and made go/no-go decisions from a quality perspective.
  • Defined test data needs in partnership with IT and provided technical support to data warehouse teams.
  • Applied business analysis concepts for logical data modeling, data flow processing, and database design.
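
An illustrative pre-load validation of a pipe-delimited text extract; the column layout and file name are assumptions:

    # Flag lines with a wrong column count or empty required fields
    # before the file is loaded into the target database.
    import csv

    EXPECTED_COLS = 5
    REQUIRED = (0, 1)          # e.g. positions of account_id, effective_date

    def validate(path):
        bad = []
        with open(path, newline="") as fh:
            for lineno, row in enumerate(csv.reader(fh, delimiter="|"), 1):
                if len(row) != EXPECTED_COLS:
                    bad.append((lineno, "column count"))
                elif any(not row[i].strip() for i in REQUIRED):
                    bad.append((lineno, "missing required field"))
        return bad

    for lineno, reason in validate("accounts_extract.txt"):
        print(f"line {lineno}: {reason}")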

Environment: Teradata, BTEQ, Python, Mainframes, SAS/BASE, SAS/ODS, SAS/GRAPH, SAS/ACCESS, FLOAD, UNIX, Windows XP, Quality Center, SQL Assistant

Confidential, GA

System Analyst

Responsibilities:

  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from source to target data environments; developed working documents to support findings and assign specific tasks.
  • Created data masking mappings to mask sensitive data between production and test environments (see the sketch after this list).
  • Involved in reviewing business requirements and analyzing data sources from Excel/Teradata for the design, development, testing, and production rollover of reporting and analysis projects within Tableau Desktop.
  • Created views in Tableau Desktop that were published to the internal team for review, further data analysis, and customization using filters and actions.
  • Worked on claims data and extracted data from various sources such as flat files, Teradata and Mainframes.
  • Worked with data investigation, discovery, and mapping tools to scan every data record from many sources.
  • Created Bullet graphs to determine profit generation by using measures and dimensions data from Teradata and MS Excel.
  • Blended data from multiple databases into one report by selecting primary keys from each database for data validation.
  • Worked with end users to gain an understanding of information and core data concepts behind their business.
  • Assisted in defining business requirements for the IT team and created BRD and functional specifications documents along with mapping documents to assist the developers in their coding.
  • Identify & record defects with required information for issue to be reproduced by development team.

Environment: Teradata SQL Assistant, MS Office, Business Objects, ClearQuest, SharePoint, Erwin, ClearCase, Teradata R13, Tableau Desktop.

Confidential, Houston, TX

Tableau Developer

Responsibilities:

  • Developed Tableau dashboards according to user specifications.
  • Extensively used Data Blending techniques in dashboard development.
  • Supported different user groups and business domains.
  • Good knowledge of Tableau Server administrative functions: installation, configuration, server backups, and load balancing techniques.
  • Defined the architecture for Tableau and established Dev/QA/Prod environments.
  • Responsible for security configuration, including user/group setup, permissions, security roles, and configuration of trusted ticket authentication.
  • Set up new projects, security groups, and roles, and administered users for all supported platforms.
  • Administered users, user groups, and scheduled instances for reports in Tableau (see the sketch after this list).
  • Monitored Tableau Servers for high availability to users.
  • Worked on Tableau Server upgrades for new versions.
  • Provided Tableau demos to newly onboarded users.
  • Developed worksheets and data visualization dashboards with graphs and filters.
  • Created report schedules on Tableau server.
  • Developed monthly reports and dashboards.
  • Created calculated fields as per business requirement.
  • Defined best practices for Tableau report development.
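
A sketch of scripting this kind of user administration against Tableau Server. It uses the modern tableauserverclient package, which postdates the Tableau Server 6/7 versions listed below, so treat it as an assumption rather than the original tooling; server URL and account names are placeholders:

    # Sign in, add a user, and list report schedules.
    import tableauserverclient as TSC

    auth = TSC.TableauAuth("admin", "***", site_id="")
    server = TSC.Server("https://tableau.example.com", use_server_version=True)

    with server.auth.sign_in(auth):
        server.users.add(TSC.UserItem("report_viewer", site_role="Viewer"))
        for schedule in TSC.Pager(server.schedules):
            print(schedule.name, schedule.schedule_type)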

Environment: Tableau Server 6.0.10, 7.0, Tableau Desktop, Oracle, Teradata, Excel, Windows 2008 R2.

Confidential

SQL Developer

Responsibilities:

  • Actively involved in business meetings with the customers for understanding the as-is process flow and new business requirements or enhancement activities.
  • Involved in logical and physical modeling of the application.
  • Created standard filters, prompts, calculations and conditions.
  • Modified and updated existing custom forms to adapt to new database schema.
  • Planned complete backups of the database and restored the database from disaster recovery.
  • Developed data models, SQL queries, SQL query tuning processes, and schemas.
  • Used Toad to create PL/SQL objects (triggers, sequences, stored procedures); see the sketch after this list.
  • Generated various analytical reports using Microsoft Access.
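
A hedged sketch of the sequence/stored-procedure work, issued here through the python-oracledb driver rather than Toad; all object names are placeholders:

    # Create a sequence and a stored procedure, then call the procedure.
    import oracledb

    conn = oracledb.connect(user="app", password="***", dsn="dbhost/ORCL")
    cur = conn.cursor()

    cur.execute("CREATE SEQUENCE order_seq START WITH 1 INCREMENT BY 1")
    cur.execute("""
        CREATE OR REPLACE PROCEDURE log_order(p_id NUMBER) AS
        BEGIN
            INSERT INTO order_log (order_id, logged_at)
            VALUES (p_id, SYSDATE);
        END;""")
    cur.callproc("log_order", [42])
    conn.commit()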

Environment: Oracle 11g, Oracle Application Servers, UNIX Server 4.3, Power Builder 12, Windows NT, SQL*LOADER, and PL/SQL Developer.
