
Data Analyst Resume


Baton Rouge, LA

CAREER OBJECTIVE:

Business/Requirements Analyst, Data Analyst, Database Management, Data Warehousing

COMPUTER SKILLS

Languages: SQL, PL/SQL, UNIX shell scripting, HTML

Applications: Microsoft Project, Microsoft Visio, Microsoft Office, Business Objects, ETL tools, Novell, BMC Remedy Asset Management, and AWS.

Database Software: Oracle Enterprise Data Quality for Product Data (PDQ), Oracle 11g, Microsoft Access, SQL*Plus, SQL Server Management Studio

Servers: Citrix, Cisco, Oracle, SQL Server 2008

Systems: Windows XP, Vista, Windows 7, Windows 10, Linux, Red Hat Linux

Other Skills: PC hardware/software installation, troubleshooting, and repair.

Completed training in Database Management, Data Warehousing, Microsoft SQL Server, and Oracle PL/SQL.

Cloud Computing: AWS, Azure, Anaconda Cloud, Elasticsearch, Solr, Lucene, Cloudera, Databricks, Hortonworks, Elastic MapReduce, AWS RDS, Amazon API Gateway, Python, Power BI, and the Microsoft BI stack (SSRS and SSAS)

EXPERIENCE:

Confidential, Baton Rouge LA

Data Analyst

Responsibilities:

  • Track and cleanse data within Oracle Enterprise Data Quality for Product Data (PDQ).
  • Test data within PDQ.
  • Build and support the transformation of various data inputs into the enterprise data warehouse; analyze and design solutions, create technical specs for integrations, and consult with developers and QA personnel on tasks related to imports and extracts; troubleshoot existing interface issues.
  • Responsible for the final work product within assigned projects as it relates to data, data design, transformations, and quality of results.
  • Led a successful project to stand up a database for Individual Health Insurance Exchange applications and payments; owing to resource constraints, designed the solution, created all database objects, wrote specs for integrations, exports, and reports, and handled data issues.
  • Completed the creation or revision of specs and data-flow documentation for 14 projects.
  • Saved cost and hours on a project by revising how certain data was reported.
  • Discovered and corrected an interface that was improperly transmitting private data.
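The cleansing work described above can be sketched in Python. This is a minimal, illustrative example of the general technique (trimming, dropping incomplete rows, de-duplicating by key) rather than any actual PDQ logic; the field names `product_id` and `description` are hypothetical.

```python
# Illustrative data-cleansing sketch: trim whitespace, drop rows missing a
# key, and de-duplicate by that key before loading into a warehouse.
# Field names are hypothetical, not taken from any real PDQ project.

def cleanse_records(records):
    """Return cleaned, de-duplicated records keyed by product_id."""
    seen = set()
    clean = []
    for rec in records:
        pid = (rec.get("product_id") or "").strip()
        if not pid or pid in seen:
            continue  # skip rows with no key or a key already seen
        seen.add(pid)
        clean.append({
            "product_id": pid,
            "description": (rec.get("description") or "").strip(),
        })
    return clean

raw = [
    {"product_id": " A100 ", "description": "Widget "},
    {"product_id": "A100", "description": "Widget duplicate"},
    {"product_id": "", "description": "missing id"},
    {"product_id": "B200", "description": "Gadget"},
]
print(cleanse_records(raw))
```

In a real pipeline the same checks would typically run inside the data-quality tool or as a staging-table query, but the pattern is the same: normalize first, then filter, then de-duplicate.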

Confidential, Philadelphia PA

Business Analyst

Responsibilities:

  • Create queries in SQL Server Management Studio to generate reports verifying all device exceptions within the organization, ensuring each device is either deployed or decommissioned.
  • Conduct meetings and email teams to discuss approaches for verifying inventory records.
  • Experience with the Centers for Medicare and Medicaid Services, health care provider data, and the online data submission process.
  • Use state-of-the-art decision support and neural network tools to detect potential fraud and support investigations: organizing case files, researching violations, and accurately and thoroughly documenting all steps taken in project development.
  • Used the SQL Assistant front-end tool to issue SQL commands matching the business requirements and run reports on provider data.
  • Provide data analysis support to the fraud investigation team for their investigation leads.
  • Prepare reports working with database structures and data modeling, including a working knowledge of SQL.
  • Conduct self-directed research to uncover problems in Medicare payments made to a variety of provider types, including physicians, suppliers, hospitals, and rehabilitation facilities.
  • Handle sensitive healthcare material, including experience with transaction standards (e.g., HIPAA).
  • Attend meetings, training, and conferences.
  • Servers, Applications and Software used: Citrix, Business Objects, MCS, MS Access, Excel, Word, Outlook.
  • Provide excellent customer care.
  • Gather, analyze, and document data requirements for projects of medium to high complexity and moderate to high risk; perform source-system data quality analysis.
  • Participate in the analysis of client business processes and functional or reporting requirements.
  • Created new database objects such as tables, procedures, functions, indexes, and views.
  • Conducted data modeling review sessions for different user groups and participated in requirement sessions to identify requirement feasibility.
  • Participate in cross-functional task forces to identify and document functional or reporting requirements.
  • Performed numerous data-pull requests using SQL for analysis.
  • Extracted data from existing data stores; developed and executed departmental reports for performance and response purposes using Oracle SQL, MS Access, and MS Excel.
  • Servers, Applications and Software used: Citrix, IDX/GE Managed Care Application, MS Access, Excel, Word.
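The device-exception reporting described above amounts to a simple SQL filter. The sketch below shows the idea using Python's built-in sqlite3 in place of SQL Server; the `devices` table, its columns, and the status values are all hypothetical stand-ins.

```python
# Minimal sketch of a device-exception report (sqlite3 standing in for
# SQL Server Management Studio). Table and column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE devices (asset_tag TEXT, status TEXT);
    INSERT INTO devices VALUES
        ('D001', 'deployed'),
        ('D002', 'decommissioned'),
        ('D003', 'in transit'),
        ('D004', 'deployed');
""")

# A device is an exception if it is neither deployed nor decommissioned.
rows = conn.execute("""
    SELECT asset_tag, status
    FROM devices
    WHERE status NOT IN ('deployed', 'decommissioned')
    ORDER BY asset_tag
""").fetchall()
print(rows)
```

Against a production inventory database the same `NOT IN` (or a `LEFT JOIN` against an approved-status list) drives the exception report that teams then follow up on.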

Confidential, Louisville KY

BI/Data Analyst

Responsibilities:

  • Collected the business requirements from the subject matter experts like data scientists and business partners.
  • Used NoSQL databases like MongoDB in implementation and integration.
  • Worked on moving analyzed data into Hive tables using Sqoop, making it available for visualization and report generation by the BI team.
  • Configured the Oozie workflow engine scheduler to run multiple Hive, Sqoop, and Pig jobs.
  • Used Oozie to automate and schedule business workflows per the requirements.
  • Used machine images to create instances with Hadoop preinstalled and running.
  • Developed a task execution framework on EC2 instances using SQL and Cassandra.
  • Designed a cost-effective archival platform for storing big data using Hadoop and its related technologies.
  • Connected various data centers and transferred data between them using Sqoop and various ETL tools.
  • Extracted the data from RDBMS (Oracle, MySQL) to HDFS using Sqoop.
  • Used the Hive JDBC to verify the data stored in the Hadoop cluster.
  • Used different file formats like Text files, Sequence Files, Avro.
  • Loaded data from various data sources into HDFS using Kafka.
  • Integrated Kafka with Spark Streaming for real-time data processing.
  • Transferred data using Informatica tool from AWS.
  • Developed modules to extract, process & transfer the customer data using Teradata utilities.
  • Monitored Order Fulfillment Metrics jobs to make sure data was being loaded into Teradata tables.
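The RDBMS-to-HDFS extraction described above (Oracle/MySQL via Sqoop) is, at its core, a query followed by a delimited-text dump. This sketch mimics that extract step with sqlite3 standing in for the source RDBMS and an in-memory buffer standing in for the HDFS target; the `orders` table and its columns are hypothetical.

```python
# Hedged sketch of a Sqoop-style extract: query a relational source and
# write the rows as tab-delimited text (the default Sqoop import format).
# sqlite3 stands in for Oracle/MySQL; table/column names are illustrative.
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 19.99), (2, 5.00), (3, 42.50);
""")

buf = io.StringIO()  # in a real job this would be a file destined for HDFS
writer = csv.writer(buf, delimiter="\t")
for row in conn.execute("SELECT order_id, amount FROM orders ORDER BY order_id"):
    writer.writerow(row)

print(buf.getvalue())
```

Sqoop parallelizes this pattern across mappers by splitting on a key column, but each mapper is doing essentially this: run a bounded query, serialize rows as delimited records.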
