
Data Analyst Resume


Dallas, TX

SUMMARY:

  • 8+ years of experience as a Data Analyst/Big Data Engineer
  • Proficient in Data Analysis, Cleansing, Transformation, Data Migration, Data Integration, Data Import, and Data Export through the use of ETL tools such as Informatica.
  • Analyzed data and provided insights with R Programming and Python Pandas
  • Expertise in Business Intelligence, Data warehousing technologies, ETL and Big Data technologies.
  • Experience in creating ETL mappings using Informatica to move data from multiple sources like flat files and Oracle into a common target area such as a Data Warehouse.
  • Experience in writing PL/SQL statements - Stored Procedures, Functions, Triggers and Packages.
  • Involved in creating database objects like tables, views, procedures, triggers, and functions using T-SQL to provide definition, structure and to maintain data efficiently.
  • Skilled in Tableau Desktop versions 9.x/8.x for data visualization, Reporting and Analysis.
  • Developed reports, dashboards using Tableau for quick reviews to be presented to Business and IT users.
  • Extensive knowledge in various reporting objects like Facts, Attributes, Hierarchies, Transformations, filters, prompts, Calculated fields, Sets, Groups, Parameters etc., in Tableau.
  • Hands-on experience with different ETL tools to get data into shape so it could be connected to Tableau through Tableau Data Extract.
  • Expertise in writing complex SQL queries; made use of Indexing, Aggregation and Materialized Views to optimize query performance.
  • Performed predictive Modeling, Pattern Discovery, Market Basket Analysis, Segmentation Analysis, Regression Models, and Clustering.
  • Experience in working with SAS Enterprise Guide software for reporting and analytical tasks.
  • Experience in utilizing SAS Procedures, Macros, and other SAS applications for data extraction using Oracle and Teradata.
  • Expertise in using QlikView.
  • Cloudera certified developer for Apache Hadoop. Good knowledge of Cassandra, Hive, Pig, HDFS, Sqoop and Map Reduce.
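The indexing and aggregation techniques cited above can be illustrated with a minimal sketch (SQLite stands in for the warehouse; the table and data are hypothetical, not from an actual engagement):

```python
import sqlite3

# In-memory database standing in for the warehouse (illustrative schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
cur.executemany("INSERT INTO orders (region, amount) VALUES (?, ?)",
                [("east", 10.0), ("west", 25.0), ("east", 5.0)])

# Index on the filter/group column so aggregation queries avoid a full scan.
cur.execute("CREATE INDEX idx_orders_region ON orders (region)")

# Aggregate query that can use the index.
cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region")
totals = dict(cur.fetchall())
print(totals)  # {'east': 15.0, 'west': 25.0}
```

On a production engine the same pattern applies at scale: the index (or a materialized view of the aggregate) trades write-time maintenance for read-time speed.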

TECHNICAL SKILLS:

Programming Languages: SAS, R Programming, Python, Matlab, VB, Java, C, C++, SQL, MySQL, PL/SQL

ETL Tools: Informatica PowerCenter 9.1/8.6 (Designer, Workflow Manager/Monitor, Repository), Ab Initio

Web Programming: HTML, CSS, JavaScript

Testing Tools: HPQC, HPQTP, HPLoadRunner, HP ALM, IBM Clear Quest, IBMRQM, Jira, MTM, SDLC

Database Tools: Oracle SQL Developer, Toad, Oracle 10g/11g, MS SQL Server 2005/2008, SSIS, SSRS, Data Grid

Big Data Technologies: Cassandra, Pig, Hive, HDFS, Map Reduce, Sqoop, Yarn

BI and Analytics Tools: OBIEE, Oracle Reports Builder, Tableau, Pandas, Seaborn, Matplotlib, Cognos, Excel, SAS, SAS Enterprise Miner

Operating System/Framework: Windows, Macintosh, UNIX, Hadoop

Data Modeling: Regression Modeling, Time Series Modeling, PDE Modeling, Star-Schema Modeling, Snowflake-Schema Modeling, FACT and Dimension tables

Cloud Tools: Google Cloud Platform, Google BigQuery

PROFESSIONAL EXPERIENCE

Data Analyst

Confidential, Dallas, TX

Responsibilities:

  • Devised simple and complex SQL scripts to check and validate Dataflow in various applications.
  • Performed Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export through Python.
  • Devised PL/SQL/Pig/Hive statements - Stored Procedures, Functions, Triggers, Views and packages. Made use of Indexing, Aggregation and Materialized views to optimize query performance.
  • Developed logistic regression models (using R programming and Python) to predict subscription response rate based on customer variables like past transactions, response to prior mailings, promotions, demographics, interests and hobbies, etc.
  • Created Tableau dashboards/reports for data visualization, reporting and analysis, and presented them to the Business.
  • Created Data Connections and published them on Tableau Server for use with Operational and Monitoring Dashboards.
  • Knowledge of the Tableau Administration Tool for configuration, adding users, managing licenses and data connections, scheduling tasks, and embedding views by integrating with other platforms.
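The logistic regression work described above can be sketched in plain Python (the predictor values and response labels below are made up for illustration; a real model would use scikit-learn or R's glm over many customer variables):

```python
import math

# Toy data: past-transaction counts vs. whether the customer responded (0/1).
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [0, 0, 0, 1, 1, 1]

# Fit weight and bias by stochastic gradient descent on the log-loss.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(5000):
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + math.exp(-(w * xi + b)))  # sigmoid
        w -= lr * (p - yi) * xi
        b -= lr * (p - yi)

def predict(x):
    """Predicted response probability for a given transaction count."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

print(round(predict(1.0), 3), round(predict(6.0), 3))
```

Customers with few past transactions score near 0 and heavy transactors near 1, which is the response-rate ranking the mailing campaigns would act on.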

Big Data Engineer/Analyst

Confidential, CA

Responsibilities:

  • Gathered data and business requirements from end users and management. Designed and built data solutions to migrate existing source data in Teradata and DB2 to BigQuery (Google Cloud Platform).
  • Performed data manipulation on extracted data using Python Pandas.
  • Worked with subject matter experts and the project team to identify, define, collate, document and communicate the data migration requirements.
  • Designed Sqoop scripts to load data from Teradata and DB2 into the Hadoop environment, and shell scripts to transfer data from Hadoop to Google Cloud Storage (GCS) and from GCS to BigQuery.
  • Validated Sqoop jobs and shell scripts, and performed data validation to check that data was loaded correctly without any discrepancy. Performed migration and testing of static data and transaction data from one core system to another.
  • Developed best practices, processes, and standards for effectively carrying out data migration activities. Worked across multiple functional projects to understand data usage and implications for data migration.
  • Prepared data migration plans including migration risks, milestones, quality and business sign-off details.
  • Oversaw the migration process from a business perspective. Coordinated between leads, the process manager and the project manager. Performed business validation of uploaded data.
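A hedged sketch of the pipeline described above, assembling the Sqoop import and BigQuery load commands as strings (the JDBC URL, table, paths and dataset names are placeholders, not real endpoints):

```python
# Build the shell commands the migration scripts would run. Keeping them as
# pure string builders makes the argument wiring easy to unit-test.
def sqoop_import_cmd(jdbc_url, table, target_dir, mappers=4):
    """Sqoop import from an RDBMS (Teradata/DB2) into HDFS."""
    return ("sqoop import --connect {url} --table {tbl} "
            "--target-dir {dir} --num-mappers {m}").format(
                url=jdbc_url, tbl=table, dir=target_dir, m=mappers)

def bq_load_cmd(dataset_table, gcs_uri):
    """bq CLI load from a GCS URI into a BigQuery table."""
    return "bq load --source_format=CSV {dt} {uri}".format(
        dt=dataset_table, uri=gcs_uri)

cmd = sqoop_import_cmd("jdbc:teradata://host/db", "sales", "/staging/sales")
load = bq_load_cmd("warehouse.sales", "gs://bucket/sales/*.csv")
print(cmd)
print(load)
```

In the actual jobs these strings would be executed by the shell scripts, with `hdfs dfs`/`gsutil` handling the Hadoop-to-GCS hop in between.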

Test Data Analyst

Confidential, Philadelphia, PA

Responsibilities:

  • Gathered data and business requirements from end users and management. Designed and built data solutions to migrate existing source data from the Data Warehouse to the Atlas Data Lake (Big Data).
  • Performed all the Technical Data Quality (TDQ) validations, including Header/Footer validation, Record Count, Data Lineage, Data Profiling, Checksum, Empty File, Duplicates, Delimiter, Threshold and DC validations for all data sources.
  • Analyzed huge volumes of data. Devised simple and complex Hive and SQL scripts to validate dataflow in various applications. Performed Cognos report validation. Made use of MHUB for validating Data Profiling and Data Lineage.
  • Devised PL/SQL statements - Stored Procedures, Functions, Triggers, Views and Packages. Made use of Indexing, Aggregation and Materialized Views to optimize query performance.
  • Created reports using Cognos to perform data validation.
  • Worked with senior management to plan, define and clarify dashboard goals, objectives and requirements.
  • Responsible for daily communications to management and internal organizations regarding status of all assigned projects and tasks.
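A few of the TDQ checks listed above (header/footer presence, trailer record count, duplicates) can be sketched on a toy delimited feed (the file layout and field names are illustrative):

```python
# Illustrative feed: HDR header line, pipe-delimited records, TRL trailer
# carrying the expected record count.
feed = [
    "HDR|sales_feed|2019-01-01",
    "1001|east|10.00",
    "1002|west|25.00",
    "1001|east|10.00",   # duplicate record
    "TRL|3",
]

def tdq_validate(lines, delimiter="|"):
    """Run basic technical-data-quality checks on a delimited feed."""
    body = lines[1:-1]
    return {
        "has_header": lines[0].startswith("HDR"),
        "has_footer": lines[-1].startswith("TRL"),
        "count_matches_trailer": int(lines[-1].split(delimiter)[1]) == len(body),
        "duplicates": len(body) - len(set(body)),
    }

print(tdq_validate(feed))
```

Each check maps to a pass/fail gate before the data is promoted into the lake; a nonzero duplicate count or a trailer mismatch would fail the load.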

Data Analyst/Tableau Developer

Confidential, Tampa, FL

Responsibilities:

  • Gathered data and business requirements from end users and management. Designed and built data solutions to satisfy application requirements.
  • Developed reports, dashboards using Tableau 8.3 for quick reviews to be presented to Business and IT users.
  • Created Scatter Plots, Stacked Bars, Box-and-Whisker Plots using reference lines, Bullet Charts, Heat Maps, Filled Maps, Symbol Maps and Pareto Charts according to deliverable specifications.
  • Imported the analyzed data into Tableau and showed regression, trend and forecast lines in the dashboard for the selected industries.
  • Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
  • Developed metrics, attributes, filters, reports and dashboards, and created advanced chart types, visualizations and complex calculations to manipulate the data.
  • Analyzed huge volumes of data. Experience wif various ETL, data warehousing tools and concepts. Created data warehouse design.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
  • Created source to target data mappings, business rules, and business and data definitions.
  • Developed dimensions and fact tables for data marts like the Monthly Summary and Inventory data marts, with various dimensions like Time, Services, Customers and Policies.
  • Used Informatica & SAS to extract, transform & load source data from transaction systems, generated reports, insights, and key conclusions.
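A minimal sketch of the back-end validation work described above, as a source-to-target record-count check (SQLite stands in for the warehouse; the schema and rows are illustrative):

```python
import sqlite3

# Toy source and target tables standing in for a transaction system and
# the warehouse staging table it loads into.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INTEGER, val TEXT)")
cur.execute("CREATE TABLE tgt (id INTEGER, val TEXT)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])
cur.execute("INSERT INTO tgt SELECT * FROM src")  # simulated ETL load

# Validation query: do source and target row counts agree after the load?
cur.execute("SELECT (SELECT COUNT(*) FROM src) = (SELECT COUNT(*) FROM tgt)")
counts_match = bool(cur.fetchone()[0])
print(counts_match)  # True
```

In practice the same pattern extends to column-level checksums and `EXCEPT`/`MINUS` queries that surface rows dropped or mutated by the mapping.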

Data Analyst

Confidential

Responsibilities:

  • Executed quantitative analysis on chemical products to recommend effective combinations
  • Performed statistical analysis using SQL, Python, R Programming and Excel.
  • Imported, cleaned, filtered and analyzed data using tools such as SQL, Hive and Pig.
  • Used Python and SAS to extract, transform and load source data from transaction systems, and generated reports, insights, and key conclusions.
  • Manipulated and summarized data to maximize possible outcomes efficiently
  • Analyzed and recommended improvements for better data consistency and efficiency
  • Designed and developed data mapping procedures for ETL - data extraction, data analysis and the loading process - for integrating data using R programming.
  • Effectively communicated plans, project status, project risks and project metrics to the project team; planned test strategies in accordance with project scope.

Data Analyst

Confidential

Responsibilities:

  • Headed negotiations to find optimal solutions with project teams and clients
  • Mapped client business requirements to internal requirements of trading platform products
  • Supported revenue management using statistical and quantitative analysis, developed several statistical approaches and optimization models.
  • Managed and maintained required documentation
  • Led the business analysis team of four members in the absence of the Team Lead
  • Added value by providing innovative solutions and delivering improved methods of data presentation by focusing on the business need and the business value of the solution.
