Big Data Analyst Resume

Charlotte, NC

TECHNICAL SKILLS

Programming Languages: Java, PL/SQL, JCL, COBOL

Analytics Languages: R, Python, SQL, Scala, SAS

IDEs & Tools: RStudio, Eclipse, IntelliJ, PyCharm, PySpark, Weka

AWS Technologies: S3, EC2, SQS, SNS, EMR

Databases & Data Integration: Mainframe, Oracle, NoSQL, MySQL, ETL

Web Development: JavaScript, HTML, CSS

Big Data Technologies: MapReduce, Hive, Spark, HBase, Pig, YARN, Microsoft Azure

Data Visualization: Tableau Desktop/Server, Weka, Power BI, Pivot Tables, VBA, VLOOKUP

SAS Skills: SAS-BASE, GRAPH, MACRO, SQL, ODS, STAT, MINER

Competencies: Logistic and Linear Regression, Time Series Analysis, CHAID, Factor Analysis, CART, Survival Analysis

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Big Data Analyst

Responsibilities:

  • Employed a time series algorithm on parts-sales data to forecast future part and vehicle demand, estimate the required availability of vehicle parts in inventory, and generate a cost-saving inventory management plan for the team.
  • Performed data mining of the billing history database and identified significant previously unbilled
  • Developed Microsoft Access applications to automate reporting tasks, e.g. daily customer enrollment status reports, billing histories, billing variance reporting and energy usage histories.
  • Extracted, compiled, formatted, reconciled and submitted the physical energy sales data for the Federal Energy Regulatory Commission (FERC) Electric Quarterly Report (EQR). Significantly reduced report creation time by designing and developing a Microsoft Access application to merge, consolidate, and format sales data from over 88 Microsoft Excel spreadsheets. Created additional applications to format data extracted from the Allegro, ZaiNet and nMarket systems.
  • Extracted all data from the existing billing system and associated systems as part of a new billing system implementation (Banner, 9-month project). Created queries that combined, manipulated and exported the data to Excel workbooks. Created various data verification and compression processes.
  • Designed and created back office accounting reports via the report-writer function of the ZaiNet energy trading, scheduling and risk management system.
  • Created ad-hoc and production property preservation SQL reports. Imported property inspection orders and cancellations from banks and mortgage companies, and returned the inspection results.
  • Created and maintained process and procedure documentation.
  • Provided special reporting and ad hoc reporting (scholar/fund data /market values/book values) as needed. Researched and resolved data integrity issues.
  • Initiated, compiled, and communicated the merge of Word and OneNote process and procedure documents into one resource file.
  • Initiated and maintained many improvements to multiple FileMaker Endowed Scholarship databases resulting in significant time-savings and increased efficiencies:
  • Created a multi-script process pulling data from multiple databases to be merged into one pdf file for endowed scholarship donor acknowledgement and reporting.
  • Designed and created a FileMaker scholar thank-you letter proofing layout which increased colleague efficiency and timeliness. Created additional layouts that mirrored Dartmouth stationery eliminating the need for letter-head stock.
  • Created scripts which identified all funds supported by a household (e.g. husband and wife with multiple and separate funds) and then produced the household's scholar announcement letter.
  • Merged the data from two separate scholarship databases into one database resulting in a significant maintenance time-savings.
  • Performed data updates to the Scholarship Fund, Monitored Fund, In Memory Of and Prizes/Awards databases by importing data from multiple FileMaker and Oracle database sources (Financial Aid Office, Advance, Data Warehouse and iModules).
  • Provided technical and software support as well as training (FileMaker, Excel, Acrobat Pro, printing and processes) to colleagues.
  • Extracted, compiled, tracked and analyzed data to generate reports in a variety of layouts (Excel, PDF, Tableau and SAS dashboards), and modeled data structures for multiple projects using Mainframe and Oracle.
  • Maintained the data integrity during extraction, ingestion, manipulation, processing, analysis and storage.
  • Presented more than 15 impactful visualization time series dashboards and stories by employing Tableau desktop and server, Excel, pivot tables, SQL queries, Power BI, SAS and Visual Basic macros
  • Responsible for enhancing the data model according to business requirements.
  • Developed scripts to create sequences in all databases that accommodate the extended enterprise key.
  • Analyzed and developed performance improvements for the required data and tables.
  • Deployed the changes to various environments and tested the changes.
  • Worked on the QA and staging builds in TFS and merged all the builds based on the requirement.
  • Enhanced existing models to reduce data redundancy and improve functionality.
  • Worked on parent-child key hierarchies and created scripts for their sequences according to level, so that data migration from each database posed no major challenges.
  • Worked with Java team to accommodate the changes in the front end based on the changes we make in the tables in the database.
  • Modified the PL SQL packages for better performance of the jobs and batch processes.
  • Worked on the global master data in the production databases and analyzed the column length definitions, maximum values of the primary keys and their differences.
  • Modeled advanced Visual Basic for Applications (VBA) macros on various vendor data reports to plan the data usage structure with minimum overage cost using Excel.
  • Employed a time series algorithm on data usage to forecast future plan usage and estimate overage, generating a cost-saving data usage plan for teams.
  • Created a backdating data plan process to avoid unneeded data overage costs for 5 teams using VBA, Tableau and SQL query report generation.
  • Performed data analysis and data profiling using complex SQL queries on various source systems.
  • Developed a SharePoint documentation template to support findings, track project status and assign specific tasks.
  • Involved with data profiling of multiple sources using SQL Management Studio and presented initial discovery in Excel tables and reports.
  • Used project management tools such as Kanban and SharePoint to keep stakeholders updated about the project.
  • Leveraged sentiment analysis to establish a consumer feedback system using MapReduce and text mining in Java.
  • Developed and systematized end-to-end statistical models on high-volume data sets by manipulating data with Hive queries and Spark on Hadoop for faster results, and resolved stakeholder issues under tight deadlines.
  • Gathered business requirements through one-to-one and group meetings with vendors, the Order Management team and the Supply Chain team. Presented initial KPI frameworks to gain project line of sight.
  • Worked on altering the tables based on the requirements to improve the performance of the data processes.
  • Worked on the analysis of different levels of data in production and extracted the parent-level data on an application basis to migrate it into the new database.
  • Worked on converting the already existing data based on expanded column length of the primary key in around 300 tables.
  • Worked on modifying the PL/SQL code, replacing the table with joins between the source tables.
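The time-series forecasting work described above can be sketched with simple exponential smoothing. This is a minimal, hypothetical illustration only: the resume does not specify the model or data, and the sample sales figures below are invented.

```python
def exponential_smoothing(series, alpha=0.3):
    """Simple exponential smoothing: each forecast blends the most
    recent observation with the previous forecast."""
    forecast = [series[0]]  # seed with the first observation
    for t in range(1, len(series)):
        forecast.append(alpha * series[t - 1] + (1 - alpha) * forecast[t - 1])
    return forecast

def next_period_forecast(series, alpha=0.3):
    """Forecast the next period from the smoothed history."""
    smoothed = exponential_smoothing(series, alpha)
    return alpha * series[-1] + (1 - alpha) * smoothed[-1]

# Hypothetical monthly part-sales counts
sales = [120, 130, 125, 140, 150, 145]
print(round(next_period_forecast(sales), 1))
```

A real inventory plan would compare such forecasts against stock levels and lead times; libraries like statsmodels offer more rigorous time-series models (ARIMA, Holt-Winters) than this hand-rolled smoother.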
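The sentiment-analysis feedback system mentioned above was built with MapReduce and text mining in Java; the sketch below mimics the same map/reduce shape in Python. The keyword lexicons and sample reviews are hypothetical stand-ins, not the actual system's model.

```python
from collections import Counter
from functools import reduce

# Hypothetical sentiment lexicons; a production system would use
# a trained classifier rather than fixed word lists.
POSITIVE = {"great", "good", "love", "excellent"}
NEGATIVE = {"bad", "poor", "hate", "terrible"}

def map_phase(review):
    """Map step: count sentiment-bearing tokens in one review."""
    counts = Counter()
    for token in review.lower().split():
        if token in POSITIVE:
            counts["positive"] += 1
        elif token in NEGATIVE:
            counts["negative"] += 1
    return counts

def reduce_phase(a, b):
    """Reduce step: merge per-review counts into a global tally."""
    return a + b

def sentiment_tally(reviews):
    return reduce(reduce_phase, map(map_phase, reviews), Counter())

reviews = ["Great service and good price", "Terrible delay, poor support"]
print(sentiment_tally(reviews))
```

In an actual Hadoop job the map and reduce steps run distributed across nodes, with the framework shuffling intermediate (key, count) pairs between them.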

Environment: Oracle 11g/12c, Sybase PowerDesigner, Windows 7, SQL, PL/SQL, Toad, TFS, MS Visio