
Data Analyst/ Systems Analyst Resume


Glen Allen, VA

SUMMARY

  • 8+ years of experience in Information Technology as a Data Analyst/Systems Analyst/SAS Reporting Analyst/Data Integration Engineer/QA Analyst using Teradata, SAS, Oracle, and SQL Server RDBMS.
  • Hands-on experience in importing, cleaning, transforming, and validating data and drawing conclusions from the data for decision-making purposes.
  • Detailed understanding of SDLC methodologies (Agile, Scrum, RUP, Waterfall)
  • Experience in running PL/SQL, UNIX, Informatica workflows, and SAS scripts and troubleshooting issues
  • Hands-on experience in developing and maintaining dashboards/reports using Tableau
  • Experience in loading data from one database to another
  • Worked with and extracted data from various database sources such as Oracle, SQL Server, and Teradata.
  • Experience with data warehousing techniques such as Slowly Changing Dimensions, surrogate keys, snowflaking, etc. Worked with star schemas, data models, E-R diagrams, and physical data models.
  • Expert in data conversion, normalization, and multi-dimensional modeling; involved in creating fact tables, dimension tables, star schemas, and snowflake dimensional schemas using the Erwin tool.
  • Experience in running scripts in SAS and generating reports using pivot tables
  • Experience in writing BASE SAS programs for converting Teradata tables into flat files (CSV, fixed-format, delimited, etc.)
  • Experience in using ETL methodologies to support data extraction, data migration, and data transformation.
  • Expertise in OLTP/OLAP system study, analysis, and E-R modeling; developed database schemas such as star schema and snowflake schema used in relational, dimensional, and multidimensional modeling.
  • Extensive SQL experience in querying, data extraction and data transformations.
  • Good experience in loading data files in AWS environments and performing SQL testing on AWS Redshift databases.
  • Experience in developing code using Spark, AWS, and Databricks to analyze billions of customer records, generate reports, and automate the process.
  • Experience in preparing business reports using Teradata SQL and PowerPoint
  • Experience in developing data applications with Python in Linux/Windows and Teradata environments (see the sketch following this summary)
  • Experience in Data Validation and Verification and documenting the results for QA Testing.
  • Extensive experience on Excel, Word, PowerPoint, MS Project, MS Visio, Rational Suite and SQL, PL/SQL.
  • Experienced in Financial (Credit card and Banking) and Health Insurance Domains.
  • Highly motivated self-starter with excellent communication, presentation, and interpersonal skills; performs well both independently and in a team and is always willing to work in challenging, cross-platform environments.
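
The example below is a minimal, illustrative sketch of the Python-with-Teradata work noted above (the production flat-file exports were written in BASE SAS); the host, credentials, table name, and file names are hypothetical placeholders, and the teradatasql driver and pandas are assumed to be available.

    # Illustrative only: export a Teradata table to delimited flat files.
    # Host, credentials, and table name are placeholders, not taken from any project.
    import pandas as pd
    import teradatasql  # Teradata's Python DBAPI driver

    with teradatasql.connect(host="tdprod.example.com",
                             user="analyst",
                             password="********") as con:
        # Pull the table into a DataFrame; very large tables would be read in chunks.
        df = pd.read_sql("SELECT * FROM sales_db.daily_claims", con)

    # Write CSV and pipe-delimited variants of the same extract.
    df.to_csv("daily_claims.csv", index=False)
    df.to_csv("daily_claims.txt", sep="|", index=False)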

TECHNICAL SKILLS

RDBMS: Oracle, SQL Server, Teradata, MS Access

Tools: Toad, SQL Server, AWS, EC2, UNIX, Selenium, Informatica PowerCenter, Teradata SQL Assistant, Tableau, SQL*Plus

Languages: SQL, PL/SQL, UNIX shell scripting, Python, Spark, BASE SAS

GUI: MS Office Suite, UltraEdit, Tableau

OS: Windows XP, HP-UNIX, Linux

PROFESSIONAL EXPERIENCE

Confidential, Glen Allen, VA

Data Analyst/ Systems Analyst

Responsibilities:

  • Ran test scripts and Informatica workflows whenever a new patch was deployed to the Oracle application.
  • Gathered the test reports and shared them with the concerned teams.
  • Extracted, transformed, and loaded data from source systems to generate CSV data files using Python and SQL queries.
  • Wrote Python scripts to parse XML documents and load the data into the database (a sketch appears after this list).
  • Tabulated data on drug prices, patient enrollment, and service provider information.
  • Created SQL database tables and views for the data collected from the field.
  • Worked on ad-hoc requests regarding drug price, claims, Physician details and service provider details.
  • Worked in AWS Environment for loading data files from Legacy UNIX Systems to EC2 Instances.
  • Designed rich data visualizations to communicate complex ideas to customers or company leaders using Tableau Software.
  • Wrote UNIX scripts to load data and troubleshoot job failures.
  • Performed data availability checks in the AWS Redshift database by comparing Teradata tables against Redshift tables.
  • Created presentations for data reporting by using pivot tables, VLOOKUP and other advanced Excel functions.
  • Used existing procedures to solve routine and standard problems.
  • Troubleshot job failures and reran the jobs.
  • Developed metadata models and dashboards in Tableau and transferred data to the destination through SSIS (ETL).
  • Customized data by adding filters, calculations, prompts, summaries, and functions.
  • Developed reports by reading data stored in S3 buckets using Pandas.
  • Developed Python code in Jupyter notebooks to import data from files and save the generated data as data frames.
  • Developed Tableau visualizations and dashboards using Tableau Desktop.
  • Blended data from multiple databases into one report by selecting primary keys.
  • Fine-tuned SQL queries for maximum efficiency and performance.
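
A minimal sketch of the XML-to-database load referenced in the list above, assuming pyodbc with a SQL Server DSN; the tag names, table, columns, and connection string are hypothetical placeholders.

    # Parse an XML feed and bulk-insert the records into SQL Server.
    # All element, table, and column names below are illustrative placeholders.
    import xml.etree.ElementTree as ET
    import pyodbc

    tree = ET.parse("drug_prices.xml")
    rows = [
        (rec.findtext("ndc"), rec.findtext("drug_name"), float(rec.findtext("price")))
        for rec in tree.getroot().iter("record")
    ]

    con = pyodbc.connect("DSN=claims_db;UID=loader;PWD=********")
    cur = con.cursor()
    cur.executemany(
        "INSERT INTO dbo.drug_price (ndc, drug_name, price) VALUES (?, ?, ?)",
        rows,
    )
    con.commit()
    con.close()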

Environment: Toad, Python, SAS, SQL Server, Tableau, EC2, MS Excel, UNIX, AWS Redshift, MS Office, HP-Unix.

Confidential, Glen Allen, VA

Business Analyst/ Data and Integration Engineer

Responsibilities:

  • Worked on the weekly on-call rotation for production support.
  • Documented any issues encountered during on-call support and shared them with the team.
  • Prepared documentation for the Oracle patch load testing process.
  • Ran test scripts when a new patch to the Oracle application was deployed.
  • Troubleshot job failures and reran the jobs.
  • Collected the test reports and shared them with the concerned teams.
  • Ran UNIX scripts to load data and troubleshoot job failures.
  • Checked that all the FDB loads completed successfully.
  • Ran SAS scripts to load data daily from the Oracle database to the SQL Server database for 17 clients (an illustrative sketch follows this list).
  • Loaded the FastMac baseline prices into the SQL Server database.
  • Verified that the UAC data was loaded into the SQL Server database.
  • Ensured the FMT data was loaded into the Oracle database.
  • Ensured the FastMac data was loaded into the SQL Server database.
  • Performed data management by verifying that data was loaded into the First Decision database.
  • Tabulated data on drug prices, patient enrollment, and service provider information.
  • Worked on ad-hoc requests regarding drug prices, claims, physician details, and service provider details and documented them.
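
The daily Oracle-to-SQL Server loads above ran as SAS scripts; purely as an illustration of the same movement pattern, the Python sketch below copies one day's rows between the two databases. It assumes the oracledb and pyodbc drivers, and every connection detail and object name is a placeholder.

    # Illustrative Oracle-to-SQL Server copy; the production jobs used SAS.
    import oracledb
    import pyodbc

    # Read today's rows from Oracle (schema, table, and columns are placeholders).
    ora = oracledb.connect(user="etl_user", password="********",
                           dsn="oraprod.example.com/svc1")
    ora_cur = ora.cursor()
    ora_cur.execute(
        "SELECT ndc, price, load_date FROM fdb.daily_prices "
        "WHERE load_date = TRUNC(SYSDATE)"
    )
    rows = ora_cur.fetchall()
    ora.close()

    # Write the rows into SQL Server over ODBC.
    mssql = pyodbc.connect("DSN=fastmac_db;UID=loader;PWD=********")
    cur = mssql.cursor()
    cur.fast_executemany = True
    cur.executemany(
        "INSERT INTO dbo.daily_prices (ndc, price, load_date) VALUES (?, ?, ?)",
        rows,
    )
    mssql.commit()
    mssql.close()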

Environment: Toad, SAS, SQL Server, MS Excel, MS Office, HP-Unix.

Confidential, Richmond, VA

Data Analyst

Responsibilities:

  • Ran scripts in SAS Enterprise Guide to generate daily, weekly, and monthly reports (Bank Ops dashboards) and published them in KL
  • Performed data management by running Teradata SQL scripts to confirm data was loaded in the BDW before running reports
  • Contacted the BDW support team for status when the data was not updated
  • Imported the cash files from the BDW to the ‘G’ drive by running the Recon Plus macro
  • Transferred required files from the ‘G’ drive to the SAS dome before running SAS scripts.
  • Imported data from the SAS dome and the BDW before running the SAS process
  • Performed data validation after generating reports, confirming the correct data was pulled before publishing the reports in KL
  • Generated reports using pivot tables
  • Gathered and analyzed business requirements and developed reports by creating SQL scripts and Excel workbooks to provide business solutions.
  • Participated in business meetings for requirement gathering and analysis.
  • Worked with business analysts to provide business performance data using Teradata and SQL.
  • Created volatile tables to gather data from multiple tables.
  • Created multiset tables from volatile tables (see the sketch after this list).
  • Extracted data from existing data source and performed ad-hoc queries by using SQL.
  • Extensively used INNER and OUTER Joins while extracting data from Multiple Tables.
  • Improved SQL query performance by creating primary indexes and collecting statistics on index columns.
  • Developed ad-hoc reports per business analyst and project manager data requests.
  • Provided data to business analysts in graphical format using MS Excel
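
A minimal sketch of the volatile-table pattern noted above, submitted through the teradatasql driver; the host, database, table, and column names are placeholders.

    # Build a session-scoped volatile table, collect statistics on its primary
    # index, and join it back to a permanent table. Names are illustrative only.
    import teradatasql

    with teradatasql.connect(host="tdprod.example.com", user="analyst",
                             password="********") as con:
        cur = con.cursor()
        cur.execute("""
            CREATE VOLATILE TABLE vt_recent_txn AS (
                SELECT acct_id, txn_dt, txn_amt
                FROM   ops_db.transactions
                WHERE  txn_dt >= CURRENT_DATE - 30
            ) WITH DATA
            PRIMARY INDEX (acct_id)
            ON COMMIT PRESERVE ROWS
        """)
        # Statistics on the index column help the optimizer choose join plans.
        cur.execute("COLLECT STATISTICS ON vt_recent_txn COLUMN (acct_id)")
        cur.execute("""
            SELECT a.acct_id, a.txn_amt, b.branch_name
            FROM   vt_recent_txn a
            INNER JOIN ops_db.accounts b
                    ON a.acct_id = b.acct_id
        """)
        rows = cur.fetchall()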

Environment: Teradata, SQL, MS Excel, MS Office, SAS Enterprise Guide, HP-Unix.

Confidential, Charlotte, NC

Data Analyst/ Business Analyst

Responsibilities:

  • Worked with business analysts to provide business performance data and other data using Teradata, Oracle, and SQL.
  • Gathered and analyzed business requirements and developed reports by creating SQL scripts, pivot tables in Excel, and PowerPoint presentations to provide business solutions.
  • Created Daily, Weekly, Monthly and Quarterly reports related to financial departments using Teradata, MS Excel, and UNIX.
  • Wrote BTEQ SQL scripts for large data pulls and ad hoc reports for analysis.
  • Created numerous objects such as set, multiset, derived, volatile, and global temporary tables, macros, and views using Teradata SQL.
  • Extensively used INNER and OUTER Joins while extracting data from Multiple Tables.
  • Used the EXPLAIN statement in Teradata before querying large tables with millions of records and several joins.
  • Performed data analysis to look at credit exposures, risk profiles and trends of various business programs.
  • Improved SQL query performance by creating primary indexes and collecting statistics on index columns.
  • Extensively worked on UNIX shell scripts to automate SQL scripts, check daily logs, and send status-update emails.
  • Automated Excel sheets to refresh data directly from Teradata tables using an ODBC connection (see the sketch after this list).
  • Created BASE SAS programs to load data from Excel, CSV, tab-delimited, and other delimited text files into Teradata tables.
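
A hedged sketch of the ODBC-driven Excel refresh mentioned above, assuming pyodbc, pandas, and openpyxl are installed; the DSN, query, and output path are illustrative placeholders.

    # Refresh an Excel extract from Teradata over an ODBC DSN.
    # The DSN, credentials, query, and file name are placeholders.
    import pandas as pd
    import pyodbc

    con = pyodbc.connect("DSN=tdprod;UID=report_user;PWD=********")
    df = pd.read_sql(
        "SELECT report_dt, program_name, exposure_amt FROM risk_db.credit_exposure",
        con,
    )
    con.close()

    # Overwrite the workbook that the weekly report links to.
    df.to_excel("credit_exposure_refresh.xlsx", sheet_name="exposure", index=False)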

Environment: Teradata, SQL, BTEQ, Oracle 9i, Windows XP, PowerPoint, Excel.
