Data Analyst/Python Developer Resume
Richmond, VA
SUMMARY
- Data Analyst with 6+ years of experience.
- Excellent understanding of AWS with strong proficiency in Python scripting.
- Good understanding of various analytical and reporting tools, with strong analytical and communication skills.
TECHNICAL SKILLS
Technologies: Tableau 8.0/8.1/8.2/9.1/9.2, Python, T-SQL, SAS, SAMGW, UNIX, cron jobs, JIRA, VersionOne.
Languages: C, C#, VB, Java.
Web Technologies: HTML, CSS, XML, JavaScript
Databases: Amazon Redshift, Teradata V2R5.x/6.x/12/13/14/15, Oracle 11g/10g, MS Access, SQL Server 7.0/2000/2005.
Platforms: Windows 95/98/NT/2000/XP, Unix, Linux
Miscellaneous Tools: MS Office, TOAD, SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump.
Domains: Telecom, Health Care, and Financial.
PROFESSIONAL EXPERIENCE
Confidential, Richmond, VA
Data Analyst/Python Developer
Responsibilities:
- Automated one of the Enterprise Operational Risk Management reports as part of the Risk Analytical Solutions team.
- Used the openpyxl module in Python to format Excel files (see the openpyxl sketch after this list).
- Used the Python win32com.client library to write macros as a replacement for Visual Basic (VBA) in Excel.
- Wrote Python scripts to pull data from the Redshift database, apply the required conditional transformations, and store the results in pandas data frames.
- Loaded the data from the pandas data frames into the team's user-defined schema in Redshift by staging files in an AWS S3 bucket and running the COPY command (see the round-trip sketch after this list).
- Built Tableau visualizations that presented the final output to customers and catered to their business needs.
- Created action filters to make the Tableau dashboard interactive.
- Followed Tableau performance guidelines, such as extracting data from the database as a view rather than a custom SQL query, and aggregating the data to test functionality before loading the complete data set.
- Scheduled the reports for quarterly refresh on the SAMGW server.
- Played a key role in moving data to the cloud (Oracle OBIEE to AWS Redshift) as part of the HR Data Management team.
- Parsed XML files and JSON documents in Python and loaded the extracted data into the database (see the parsing sketch after this list).
- Extensively used Python modules such as numpy, pandas, xmltodict, pycompare, datetime, and SQLAlchemy to perform data analysis.
- Managed storage in AWS using S3, created volumes and configured snapshots.
- Created EC2 instances to run automated Python scripts.
- Automated EC2 instance provisioning using AWS CloudFormation templates.
- Wrote Python scripts to validate and test the source-to-target mapping (STTM) migration from Oracle to Redshift (see the validation sketch after this list).
- Reimplemented in Python ETL logic that was originally written in Scala.
- Used Hydrograph as an ETL tool for loading the data.
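A minimal openpyxl sketch of the Excel formatting described above; the workbook name, sheet layout, and column choices are illustrative assumptions, not the original report:

```python
from openpyxl import load_workbook
from openpyxl.styles import Font, PatternFill

# Load an existing workbook (file name is illustrative).
wb = load_workbook("risk_report.xlsx")
ws = wb.active

# Bold the header row and give it a light fill.
header_font = Font(bold=True)
header_fill = PatternFill(start_color="DDDDDD", end_color="DDDDDD", fill_type="solid")
for cell in ws[1]:
    cell.font = header_font
    cell.fill = header_fill

# Widen a few columns for readability.
for col in ("A", "B", "C"):
    ws.column_dimensions[col].width = 20

wb.save("risk_report_formatted.xlsx")
```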
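A hedged round-trip sketch of the Redshift workflow (query into pandas, stage to S3, COPY into the team schema); the cluster, credentials, table names, IAM role, and the conditional rule are all placeholders, and the original scripts may have used a different driver than psycopg2:

```python
import boto3
import pandas as pd
import psycopg2

# Connect to Redshift (connection details are placeholders).
conn = psycopg2.connect(
    host="cluster.example.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="...",
)

# Pull the source data into a data frame and apply a conditional transform.
df = pd.read_sql("SELECT account_id, risk_score FROM ops_risk.events", conn)
df["risk_band"] = df["risk_score"].apply(lambda s: "high" if s >= 75 else "low")

# Stage the frame in S3 as headerless CSV, then COPY it into the team schema.
df.to_csv("/tmp/risk_band.csv", index=False, header=False)
boto3.client("s3").upload_file("/tmp/risk_band.csv", "team-bucket", "staging/risk_band.csv")

with conn.cursor() as cur:
    cur.execute("""
        COPY team_schema.risk_band
        FROM 's3://team-bucket/staging/risk_band.csv'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        CSV;
    """)
conn.commit()
```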
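A small parsing sketch showing how xmltodict and SQLAlchemy (both named in the modules bullet) could load XML and JSON records into a table; the tag names, file names, and connection URL are assumptions:

```python
import json
import xmltodict
from sqlalchemy import create_engine, text

# Parse an XML feed into a dict; tag names are assumed for illustration.
with open("employees.xml") as f:
    doc = xmltodict.parse(f.read())
rows = doc["employees"]["employee"]  # assumed: a list of record dicts

# Parse a JSON document, assumed to contain a list of the same records.
with open("employees.json") as f:
    rows += json.load(f)

# Load into the target database via SQLAlchemy (URL is a placeholder).
engine = create_engine("postgresql://user:pass@host:5439/analytics")
with engine.begin() as conn:
    conn.execute(
        text("INSERT INTO hr.employees (emp_id, name) VALUES (:emp_id, :name)"),
        [{"emp_id": r["id"], "name": r["name"]} for r in rows],
    )
```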
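A validation sketch of the kind of source-to-target check such STTM scripts typically perform: row counts plus a numeric checksum compared across Oracle and Redshift. Every table, column, and connection detail here is hypothetical:

```python
import pandas as pd
import cx_Oracle
import psycopg2

# Connection details are placeholders.
ora = cx_Oracle.connect("user/pass@ora-host/ORCL")
rs = psycopg2.connect(host="cluster.example.redshift.amazonaws.com",
                      port=5439, dbname="analytics",
                      user="etl_user", password="...")

def profile(conn, table):
    """Row count plus a numeric checksum column, per the mapping document."""
    q = f"SELECT COUNT(*) AS cnt, SUM(amount) AS amount_sum FROM {table}"
    df = pd.read_sql(q, conn)
    df.columns = df.columns.str.lower()  # Oracle returns upper-case names
    return df.iloc[0]

src = profile(ora, "hr.payroll")        # source side of the STTM
tgt = profile(rs, "hr_stage.payroll")   # migrated target side

assert src["cnt"] == tgt["cnt"], "row counts diverge"
# Cast to float: the drivers may return Decimal on one side and float on the other.
assert abs(float(src["amount_sum"]) - float(tgt["amount_sum"])) < 1e-6, "checksums diverge"
print("STTM validation passed for hr.payroll")
```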
Environment: AWS (EC2, S3, Redshift & CFT), Python, Oracle OBIEE, Hydrograph, SAMGW server, XML & CSV files, Scala.
Confidential, Atlanta, GA
Data Analyst/Python Developer
Responsibilities:
- Played a key role in gathering business requirements, systems requirements, design, gap analysis, test criteria, key performance metrics, metadata, use case diagrams, flow charts, data management, and implementation.
- Managed internal data, including identifying risks to data integrity, disaster recovery and restoration.
- Parsed JSON documents and loaded the data into the database using Python; well versed in Python collections.
- Built various graphs for business decision making using the Python matplotlib library (see the plotting sketch after this list).
- Built a Python class whose objects represented batch jobs prioritized by their severity (see the class sketch after this list).
- Used Python modules such as numpy, pandas, and datetime to perform extensive data analysis.
- Identified and fixed the causes of reported issues by checking batch loading and the Python scripts scheduled as cron jobs.
- Scheduled SAS and BTEQ scripts as UNIX cron jobs and used the resulting tables in reports.
- Created daily, weekly, and monthly reports using Teradata and SAS (PROC REPORT, PROC TABULATE, and DATA _NULL_), and automated them with UNIX wrapper scripts.
- Managed storage in AWS using Elastic Block Storage and S3; created volumes and configured snapshots.
- Analyzed and compared data in Excel and prepared reports for management.
- Experienced with AWS data backup techniques (snapshots, AMI creation), along with data-at-rest security within AWS and automation using AWS CloudFormation templates.
- Used MS Excel to import and export data from text files, saved queries, or databases; used automatic outlining; inserted subtotals; created advanced filters; and used database functions.
- Created pivot tables and charts from MS Excel worksheet data and external resources; modified pivot tables, sorted items, grouped data, and refreshed and formatted pivot tables.
- Drew upon full range of Tableau platform technologies to design and implement proof of concept solutions and create advanced BI visualizations.
- Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps, and Gantt charts.
- Developed and reviewed SQL queries using join clauses (inner, left, right) in Tableau Desktop to validate static and dynamic data.
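A plotting sketch of the matplotlib usage mentioned above; the metric and values are invented for illustration:

```python
import matplotlib.pyplot as plt
import pandas as pd

# Illustrative data: monthly batch-failure counts.
df = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr"],
    "failures": [12, 7, 9, 4],
})

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar(df["month"], df["failures"], color="steelblue")
ax.set_xlabel("Month")
ax.set_ylabel("Batch failures")
ax.set_title("Batch failures by month")
fig.tight_layout()
fig.savefig("failures_by_month.png")
```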
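A class sketch of how batch jobs might be modeled and ordered by severity; the job names and the lower-is-more-urgent convention are assumptions:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class BatchJob:
    """A batch job; a lower severity value means higher priority."""
    severity: int
    name: str = field(compare=False)  # ordering ignores the name

    def run(self):
        print(f"running {self.name} (severity {self.severity})")

# Jobs are processed by severity, not arrival order.
queue = [BatchJob(2, "weekly_rollup"), BatchJob(0, "sla_breach_fix"), BatchJob(1, "daily_load")]
heapq.heapify(queue)
while queue:
    heapq.heappop(queue).run()
```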
Environment: LINUX, UNIX Shell Scripting, Amazon S3, Tableau Desktop 9.1/9.0/8.2, Windows XP, Teradata 14.1, Python, MS Access, SAS, AWS, Flat files, CSV files, MS Excel, MS Visio, MS Word
Confidential, Columbia, MD
Data Analyst
Responsibilities:
- Analyzed the existing business model and the customer's new requirements.
- Played a key role in gathering business requirements, systems requirements, design, gap analysis, metadata, use case diagrams, flow charts, data management, and implementation.
- Established data standardization and data quality rules across all data sources and subject areas in the data warehouse.
- Established processes and solutions to identify and handle data issues and incremental/fully refreshed data.
- Performed extensive data wrangling on the extracted data using the pandas and numpy modules in Python.
- Used Git for code sharing and version control, committing daily changes with the standard commands.
- Hands-on experience using the PyCharm editor to write Python scripts, which also helped with code analysis, graphical debugging, integrated unit testing, etc.
- Experienced in Agile methodologies, Scrum stories, and sprints in a Python-based environment, along with data analytics, data wrangling, and Excel data extracts.
- Wrote Python modules to extract data from the Teradata source database (see the extraction sketch after this list).
- Developed a fully automated continuous integration system using Git, Jenkins, and custom tools developed in Python.
- Applied Python libraries such as numpy, scipy, and PyTables to various tasks.
- Experienced with event-driven programming in Python; understand Python's threading limitations (the GIL) and multi-process architecture (see the multiprocessing sketch after this list).
- Analyzed streaming data using Python and R with packages such as NumPy, NLTK, and pandas.
- Good knowledge and understanding of Amazon Web Services cloud services such as EC2, S3, EBS, and RDS.
- Worked on ad hoc requests and business reports, making extensive use of Teradata, Oracle 11g, SAS/BASE, UNIX scripting, Teradata utilities (FastLoad, MultiLoad, and BTEQ), and Excel.
- Scheduled SAS jobs in cron and automated the complete business process.
- Drew upon full range of Tableau platform technologies to design and implement proof of concept solutions and create advanced BI visualizations.
- Created extensive visualization dashboards for claims data.
- Effectively used data blending, filters, actions, and the hierarchies feature in Tableau.
- Created extensive custom joins for blending data from several different data sources.
- Developed formatted, complex, reusable reports in Tableau with advanced features such as conditional formatting, built-in and custom functions, and multiple groupings.
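An extraction sketch of pulling Teradata data into pandas; teradatasql is one possible driver (the resume does not name one, and an ODBC driver is equally plausible for that era), and the host, credentials, and query are placeholders:

```python
import pandas as pd
import teradatasql  # one possible driver; the original may have used ODBC

# Connection details are placeholders.
with teradatasql.connect(host="tdprod", user="analyst", password="...") as con:
    df = pd.read_sql(
        "SELECT claim_id, claim_amt, status "
        "FROM claims_db.claims WHERE load_dt = CURRENT_DATE",
        con,
    )

print(df.describe())
```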
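A multiprocessing sketch of the threading-limitation point: CPU-bound work serializes on the GIL under threads, so separate processes are used instead. The workload is invented for illustration:

```python
from multiprocessing import Pool

def score(chunk):
    """CPU-bound work; a thread pool would serialize on the GIL here."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    chunks = [range(i, i + 1_000_000) for i in range(0, 4_000_000, 1_000_000)]
    # Separate processes each get their own interpreter and GIL.
    with Pool(processes=4) as pool:
        print(sum(pool.map(score, chunks)))
```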
Environment: Tableau Desktop 9.1/9.0/8.2, Windows XP, SQL, Oracle 11g, MS Access, SAS, Flat files, CSV files, MS Excel, Python, R, AWS, MS Visio, Hadoop, Hive, UNIX Shell Scripting.
Confidential
Data/Business Analyst
Responsibilities:
- Created reports using Teradata and BTEQ to meet business requirements.
- Developed BTEQ and SQL queries to extract data from production databases and built data structures and reports.
- Created history tables and views on top of the finance data mart and production databases for the business analysts' data analysis and reporting needs, using Teradata, BTEQ, and UNIX.
- Generated a number of query reports, such as master/detail, cross-tab, and chart (for trend analysis), using the Business Objects Report Designer.
- Worked on an integration project end to end, including deployment and release.
- Created reports such as reports by product, reports by customer, reports by period, demographic reports, and comparative reports.
- Used Web Intelligence for web-based querying and reporting.
- Published reports using InfoView and categorized them by department to improve performance; trained users in ad hoc reporting.
- Historical data was kept on Teradata as the main database.
- Developed a data warehouse for business groups using Teradata, BTEQ, SQL, and UNIX to meet business requirements.
- Provided staging tables to business analysts and project managers on the marketing team, giving them the flexibility to perform accurate analysis for their campaign research.
- Created the daily-run, weekly-run, and monthly-run reports from the universe.
- Performed data validation and data integrity checks before delivering data to operations, business, and financial analysts, using Oracle, Teradata, BTEQ, Excel, and MS Access.
- Developed ad hoc SQL queries to extract data from production databases and build data structures and reports.
- Developed SQL and BTEQ (Teradata) ad hoc queries to fulfill business analysts' requirements.
- Developed ad hoc queries and scripts to meet the business analysts' and operational analysts' requirements using Teradata, UNIX, Oracle, SQL, and FTP.
- Debugged existing reports for data errors, data grouping, and sub-reporting issues.
- Conducted end-user training sessions and was involved in knowledge transfer.
Environment: Teradata, SQL Assistant 6.1, Teradata utilities (BTEQ, FastLoad, FastExport) 6.01, Oracle, SAS, SQL, PL/SQL, UNIX, OLAP, Business Objects XI R2 (CMC, CMS, CCM, Designer, InfoView, Desktop Intelligence, Web Intelligence, Import Wizard, Report Conversion)