Sr. Data Analyst Consultant Resume

Los Angeles

SUMMARY:

  • Data Scientist with 6+ years of experience in data analysis, business intelligence, reporting and statistical analysis, with an excellent understanding of Data Warehouse and Data Mart design.
  • Extensive experience in Informatica cloud services and Tableau, and in the creation and maintenance of database objects such as tables, views, materialized views, indexes, constraints, primary keys, sequences, synonyms and database links.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
  • Wide experience with MS SQL Server 2016/2014/2012/2008 R2/2008 and DB2, covering database development, design, system analysis and support, as well as design, development and maintenance of MSBI components (SSIS, SSRS).
  • Extensively involved in data preparation, exploratory analysis and feature engineering using supervised and unsupervised modeling.
  • Well versed in linear/non-linear regression and classification predictive modeling algorithms and A/B testing.
  • Performed data processing using Python libraries such as NumPy and Pandas, and performed data visualization using the ggplot2 library in R and Matplotlib in Python while estimating product demand (a minimal sketch follows this summary).
  • Actively involved in model selection and statistical analysis using the Gretl statistical tool.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica graphs.
  • Good experience in Data Stage Administration, Information Server (IS).
  • Experience in Data Enrichment and Re-Engineering Using Quality Stage and DataStage.
  • Extensive ETL experience with IBM DataStage 7.5/8.1/8.5/8.7/9.1/11.3, including the DataStage Designer, DataStage Director and DataStage Administrator client tools.
  • Experienced in scheduling sequence, parallel and server jobs using DataStage Director, UNIX scripts and scheduling tools.
  • Conversant with all phases of the Software Development Life Cycle (SDLC) especially Agile, Scrum, involving business process analysis, requirements gathering and analysis, detailed design, development, testing and post implementation support.
  • Strong expertise in SAS, SQL, SSIS, Excel, Access, VBA, PL/SQL.
  • Strong knowledge of Data Warehouse concepts and technologies such as ETL processes, dimensional modeling, Star and Snowflake Schemas, reporting tools and surrogate key generation.
  • Highly proficient in creating database objects like tables, indexes, views, user defined functions, stored procedures, triggers, cursors, data integrity and constraints.
  • Experience in Data cleansing by matching user introduced data with database data, removing duplicates, and extraction of relations from source systems using Quality Stage.
  • Expertise in SQL Server Analysis Services (SSAS) to deliver online analytical processing (OLAP) and data mining functionality for business intelligence applications.
  • Experienced in SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS), Data Transform Services (DTS) and SQL Server Analysis Services (SSAS).
  • Knowledgeable in Designing, Developing and deploying various kinds of reports using SSRS using relational data.
  • Scheduled & distributed reports in multiple formats using SQL Server Reporting Services (SSRS) in Business Intelligence Development Studio.
  • Experienced in designing and developing large number of reports in SSRS using the data from ETL Loads, SSAS Cubes and various heterogeneous data sources.
  • Experienced in creating Jobs, alerts, SQL Mail Agent, scheduling packages using SSRS.
  • Documentation of use cases, solutions, and recommendations.
  • Work in a fast-paced agile development environment - in a team as well as independently.
  • Strong communication skills and willingness to take initiative to contribute beyond basic responsibilities
  • Ability to multitask, prioritize issues, and monitor progress along with data presentation skills.
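
A minimal sketch of the demand-estimation workflow referenced in the summary above, assuming a hypothetical orders.csv with order_date, product_id and quantity columns (the file, column names and top-5 cutoff are illustrative, not taken from the original projects):

# Minimal sketch: monthly product-demand summary with pandas and Matplotlib.
import pandas as pd
import matplotlib.pyplot as plt

def estimate_monthly_demand(path: str) -> pd.DataFrame:
    orders = pd.read_csv(path, parse_dates=["order_date"])
    # Basic data preparation: drop rows with missing keys and remove duplicates.
    orders = orders.dropna(subset=["product_id", "quantity"]).drop_duplicates()
    # Feature engineering: derive a month period for aggregation.
    orders["month"] = orders["order_date"].dt.to_period("M")
    return (orders.groupby(["product_id", "month"])["quantity"]
                  .sum()
                  .reset_index(name="units"))

if __name__ == "__main__":
    demand = estimate_monthly_demand("orders.csv")  # hypothetical input file
    top = demand.groupby("product_id")["units"].sum().nlargest(5).index
    for pid in top:
        series = demand[demand["product_id"] == pid]
        plt.plot(series["month"].astype(str), series["units"], label=str(pid))
    plt.legend()
    plt.title("Monthly demand by product")
    plt.xlabel("Month")
    plt.ylabel("Units")
    plt.show()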

TECHNICAL SKILLS:

Database Specialties: Database Architecture, Data Analysis, Enterprise Data Warehouse, Database Design and Modeling, Data Integration and Migration, ETL Architecture and Design, Data Warehouse, OLTP, OLAP

Modeling Tools: Erwin 9.x, Rational Rose, ER/Studio, MS Visio, SAP Power designer, Embarcadero.

Databases: MS SQL Server 2016/2014/2012/2008 R2/2005, Oracle 12c/11g/10g/9i/8i, Teradata R14/R13/R12, MS Access, MongoDB

Programming Languages: SQL, PL/SQL, UNIX shell scripting, PERL, Java, Scala

Operating Systems: Windows, UNIX, MS DOS, Sun Solaris.

Reporting Tools: Tableau 8.x/9.x, Crystal Reports XI, Business Intelligence, SSRS, Business Objects 5.x/6.x, Cognos 7.0/6.0, Informatica PowerCenter 7.1/6.2, Ab Initio, DataStage

Web technologies: HTML, DHTML, XML, CSS.

Scripting Languages: Python, R, VBScript, JavaScript, UNIX Shell Script.

Project Execution Methodologies: Ralph Kimball and Bill Inmon data warehousing methodology, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD), SDLC Methodologies- Agile and Scrum

Cloud Technology: Hadoop, AWS Redshift, Snowflake, EC2, S3, Azure

Tools: MS Office suite (Word, Excel, MS Project and Outlook), TOAD, BTEQ, FastLoad, MultiLoad, FastExport.

PROFESSIONAL EXPERIENCE:

Confidential, Los Angeles

Sr. Data Analyst Consultant

Responsibilities:

  • Participated in the client requirement gathering and Design meetings, working with the researchers and clients closely.
  • Interacted with offshore BI development team, for distribution of work.
  • Designed complex SSIS packages to load data from different sources such as OLE DB, Excel, flat files and DB2 into various data warehousing systems.
  • Used PowerPivot to create reports and connect to SQL Server databases from Excel
  • Translated business needs into reporting and analytic requirements.
  • Performed dimensionality reduction using near-zero variance and correlation techniques, and validated results with hypothesis and t-tests (see the sketch after this list).
  • Validated the consolidated data and developed the model that best fits the data; interpreted data from multiple sources, consolidated it, and performed data cleansing using RStudio.
  • Applied multiple data mining techniques and derived new insights from the data.
  • Team player with good logical reasoning ability, coordination and interpersonal skills
  • Designed and developed Microsoft Access Databases, customized with macros and deployed on to SharePoint 2010 using Access Services
  • Team builder with excellent communications, time & resource management & continuous client relationship development skills.
  • Implemented action filters, parameters, calculated fields and sets for preparing dashboards and worksheets in Tableau.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps and Gantt charts via the Show Me functionality.
  • Designed parameterized nested reports based on the production log using SQL Server Reporting Services.
  • Designed cascaded dynamic statistical and graphical reports and industry reports using report/group variables, represented as pivot tables, VLOOKUPs, macros, bar charts and gauges in SQL Server Reporting Services and Excel.
  • Designed stored procedures to validate data in source files and in the database stored in SQL Server.
  • Conducted data migration and manipulation from multiple sources to SQL server using SSIS
  • Built interactive dashboards and published Tableau reports utilizing parameters, calculated fields and table calculations, user filters, action filters and sets to handle views more efficiently.
  • Worked on the development of Dashboard reports for Key Performance Indicators for the top management
  • Rebuilt existing reports into Tableau visualizations with user and action filters.
  • Excellent understanding of Data modeling (Dimensional & Relational) on concepts like Star-schema, Snowflake schema using fact and dimension tables and relational databases (SQL Server).
  • Ensured data accuracy and reliability of reports.
  • Created Bullet graphs to determine profit generation by using measures and dimensions data from Oracle, SQL Server and MS Excel.
  • Worked on Agile Scrum Methodology and designed high level ETL architecture for overall data transfer from the OLTP to EDW with the help of SSIS.
  • Successfully transferred inbound data from various sources like flat files, MS Access, and Excel into MS SQL Server 2012 using SSIS Packages.
  • Used DataStage stages namely Hash file, Sequential file, Transformer, Aggregate, Sort, Datasets, Join, Lookup, Change Capture, Funnel, Peek, Row Generator stages in accomplishing the ETL Coding.
  • Defined corporate Metadata definitions for all Enterprise Data Supported databases (operational source systems, data stores and data marts).
  • Worked on troubleshooting, performance tuning and performance monitoring for enhancement of DataStage jobs and builds across Development, QA and PROD environments
  • Prepared files and reports by performing ETL, data extraction, and data validation, managed metadata and prepared Data dictionary as per the project requirement.
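
A minimal Python analogue of the feature-reduction and validation steps described in the list above (the original work used R and Gretl); the thresholds and the assumption of an all-numeric feature DataFrame are illustrative:

# Minimal sketch: near-zero-variance filter, correlation filter, and a t-test check.
import pandas as pd
from scipy import stats

def drop_near_zero_variance(features: pd.DataFrame, threshold: float = 1e-4) -> pd.DataFrame:
    # Keep only numeric columns whose variance exceeds the threshold.
    variances = features.var(numeric_only=True)
    return features[variances[variances > threshold].index]

def drop_highly_correlated(features: pd.DataFrame, cutoff: float = 0.9) -> pd.DataFrame:
    # Drop the second column of every pair whose absolute correlation exceeds the cutoff.
    corr = features.corr().abs()
    cols = corr.columns
    to_drop = {cols[j]
               for i in range(len(cols))
               for j in range(i + 1, len(cols))
               if corr.iloc[i, j] > cutoff}
    return features.drop(columns=list(to_drop))

def no_significant_shift(before: pd.Series, after: pd.Series, alpha: float = 0.05) -> bool:
    # Welch two-sample t-test: True if the reduction did not shift the metric's mean.
    _, p_value = stats.ttest_ind(before.dropna(), after.dropna(), equal_var=False)
    return p_value > alpha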

Environment: SQL Server 2012, Python 3.5, SAS, Informatica Analyst 10.1.2, Tableau 9, R, PyCharm 3.2, Advanced Excel (macros, VLOOKUPs, pivot tables), SSRS, SSIS

Confidential, New York, NY

Senior Data System Analyst

Responsibilities:

  • Involved in analyzing, designing, building and testing OLAP cubes with SSAS 2008 and in adding calculations using MDX.
  • Extensively used Informatica Client tools- Source Analyzer, Warehouse Designer, Mapping Designer.
  • Developed Informatica mappings for data extraction and loading; worked with Expression, Lookup, Filter, Sequence Generator and Aggregator transformations to load the data from source to target.
  • Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation.
  • Designed the ETL process using Informatica, covering data movement, error capturing and reporting, and initial and delta loads, and implemented a Change Data Capture methodology (see the sketch after this list).
  • Created DDL scripts for implementing data modeling changes, created Erwin crystal reports in HTML and RTF format depending on the requirement, published the data model in the model mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
  • Strong data modeling experience using ER diagrams, dimensional data modeling, conceptual/logical/physical modeling using Third Normal Form (3NF), Star Schema modeling and Snowflake modeling with tools like Erwin, ER/Studio and SAP PowerDesigner.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Performed data analysis and development for the Customer Offers and Targeting team, using SAS Enterprise Guide, SAS Grid, Unix and Proc SQL.
  • Extensively used ERWIN for REVERSE Engineering, FORWARD Engineering, SUBJECT AREA, DOMAIN, Naming Standards Document etc.
  • Performed System Study and Requirements Analysis, prepared Data Flow Diagrams, Entity Relationship Diagrams, Data Diagrams, Table Structures.
  • Understood OLAP processing for changing and maintaining the warehouse, optimizing dimensions and hierarchies, and adding aggregations to the cube.
  • Using a Hadoop connector, extracted bank data from the source and loaded it into the Hadoop HDFS data lake.
  • Created data mapping documents to provide transformation rules to load the data from the source to the target system and serve as an input to ETL jobs.
  • Designing SAS Visual Analytics (VA) reports using VA Data builder, VA Explorer, VA Designer, VA Viewer and creating custom reports in SAS VA.
  • Manipulated the data and modified the code wherever necessary using SAS functions, Proc SQL statements, formats and informats in SAS EG.
  • Good at Data Warehouse techniques -Dimensional data Modeling, Star Schema and Snowflake Schema
  • Designed and developed star schema, snowflake schema and created fact tables and dimension tables for the warehouse and data marts using Erwin.
  • Experienced in SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS), Data Transform Services (DTS) and SQL Server Analysis Services (SSAS).
  • Extensive experience in data extraction, transformation and loading (ETL) in MS SQL Server 2005/2008/2008 R2/2012 using tools such as SSIS and Data Transformation Services (DTS).
  • Involved in designing and developing packages for data warehousing and data migration projects using Integration Services (SSIS) on different data sources.
  • Designed SSIS Packages to transfer data from various sources like Text Files, SQL Server, Excel, OLEDB and Access, .csv, SharePoint to SQL Server and automated those processes.
  • Worked on Agile Scrum Methodology and designed high level ETL architecture for overall data transfer from the OLTP to EDW with the help of SSIS.
  • Successfully transferred inbound data from various sources like flat files, MS Access, and Excel into MS SQL Server 2012 using SSIS Packages.
  • Developed SSIS tasks to download Encrypted files from remote SFTP server, decrypt and stage data in Staging DB.
  • Prepared XML, UNIX shell scripts, and tuned SQL scripts for DML operations for Oracle and DB2 databases and created and schedule Jobs on a regular basis for ETL load processes in EDW Data marts.
  • Re-designed the existing reporting data mart into a star schema to improve performance and provided support for the migration of BRIO reports to Business Objects.
  • Installed SQL Server 2012 and Management tools using SQL Server Setup Program.
  • Assisted the design and administration of the corporate Metadata Repository.
  • Involved in the creation, maintenance of Data Warehouse and repositories containing Metadata.
  • Defined corporate Metadata definitions for all Enterprise Data Supported databases (operational source systems, data stores and data marts).
  • Data modeling and design of the data warehouse and data marts using star schema methodology with dimension and fact tables.
  • Provided guidance to ETL team to translate the data mapping document into a high level design document and also during the creation of ETL jobs
  • Designed and developed ETL processes using DataStage to load data from Teradata, Flat Files to staging database and from staging to the target Data Warehouse database.
  • Used DataStage stages namely Hash file, Sequential file, Transformer, Aggregate, Sort, Datasets, Join, Lookup, Change Capture, Funnel, Peek, Row Generator stages in accomplishing the ETL Coding.
  • Worked on troubleshooting, performance tuning and performance monitoring for enhancement of DataStage jobs and builds across Development, QA and PROD environments
  • Prepared files and reports by performing ETL, data extraction, and data validation, managed metadata and prepared Data dictionary as per the project requirement
  • Enhanced the old logical and physical database design to fit new business requirement, and implemented new design into SQL Server 2008.
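
A minimal pandas sketch of the initial-versus-delta (change data capture style) comparison referenced in the list above; the production implementation used Informatica mappings, and the shared-column layout and customer_id key here are hypothetical:

# Minimal sketch: split a source snapshot into inserts and updates against the target.
import pandas as pd

def split_delta(source: pd.DataFrame, target: pd.DataFrame, key: str = "customer_id"):
    """Return (inserts, updates); assumes source and target share the same columns."""
    merged = source.merge(target, on=key, how="left",
                          suffixes=("", "_tgt"), indicator=True)
    # Keys absent from the target are brand-new rows (initial-load style inserts).
    inserts = merged[merged["_merge"] == "left_only"][source.columns]
    # Keys present in both: flag rows where any non-key attribute changed.
    both = merged[merged["_merge"] == "both"]
    changed = pd.Series(False, index=both.index)
    for col in source.columns:
        if col != key:
            changed |= both[col].ne(both[f"{col}_tgt"])
    updates = both[changed][source.columns]
    return inserts, updates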

Environment: SQL Server, Java, XHTML, CSS, JSP, Hadoop, Oracle 11g, Data Warehousing, Data Mart, Metadata, DataStage, QualityStage, SAS, SQL Server Management Studio 2014, Visual SourceSafe, HP Quality Center 10.0, Windows XP, Oracle 10g, and Control-M.

Confidential, Chicago, IL

Business Data Analyst

Responsibilities:

  • Created various standard/reusable jobs in DataStage using active and passive stages such as Sort, Lookup, Filter, Join, Transformer, Aggregator, Change Capture, Sequential File and Data Set.
  • Worked as a Data Modeler/Analyst to generate Data Models using Erwin and developed relational database system.
  • Involved in extracting the data using SSIS from OLTP to OLAP.
  • Worked on OLAP Data warehouse, Model, Design, and Implementation.
  • Created the conceptual model for the data warehouse using Erwin data modeling tool.
  • Expertise in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts and Decision Support Systems (DSS) using Multidimensional and Dimensional modeling (Star and Snowflake schema) Concepts
  • Expertise in OLAP/OLTP System Study, Analysis and E-R Modeling, developing database Schemas like Star Schema, Snowflake Schema, Conforming Dimensions and Slowly Changing Dimensions used in relational, dimensional and multi-dimensional modeling.
  • Interacted with business users and analysts and gathered and documented the technical and business metadata.
  • Conducted introductory and hands-on sessions of Hadoop HDFS architecture, Hive, Talend and Pig for other teams.
  • Worked on Informatica data integration tools such as Repository Manager, Designer, Workflow Manager and Workflow Monitor, and scheduled workflows using Workflow Manager.
  • Worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
  • Have extensively worked in developing ETL for supporting Data Extraction, transformations and loading using Informatica Power Center
  • Delivered final source to target mapping and insert scripts to the Hadoop Developers.
  • Performed transformations using SSIS tasks such as Conditional Split and Derived Column to carry out data scrubbing, including data validation checks during staging, before loading the data into the data warehouse.
  • Created Views to reduce database complexities for the end users.
  • Performed in-depth analysis of data and prepared weekly, biweekly and monthly reports using SQL, SAS, MS Excel, MS Access, and UNIX.
  • Developed Unit Test Plans for SSIS packages for checking the functionality of each component.
  • Scheduled Jobs for executing the stored SSIS packages/Master Packages which were developed to update the database on Daily/Weekly/Monthly basis using SQL Server Agent.
  • Involved in design, code, and deploy new data extractions using SSIS, and designed and produced documentation of data transformations for all extractions.
  • Provided guidelines and recommendations to the projects on their metadata, master data and reference data strategy and the selection of tools based on the approved set of tools within the EAD group.
  • Wrote reusable macros to automate recurring reports in SAS, and wrote Korn shell scripts to run SAS programs in batch mode on UNIX.
  • Defined the list codes and code conversions between the source systems and the data mart.
  • Created the Dimensional Logical Data Model for the Demand Deposit Accounts data mart using the Banking and Financial Markets Data Warehouse (BFMDW) 8.5 model in the company's InfoSphere Data Architect (IDA) tool.
  • Experienced in designing the data mart and creation of cubes.
  • Designed and developed DataStage ETL Parallel Jobs between Source and Target.
  • Extensive experience in working with Datastage Designer for developing jobs and Datastage Director to view the log file for execution errors.
  • Designed and developed a customizable data management system using Hadoop to interface with the current RBAC system.
  • Developed SSIS tasks to download Encrypted files from remote SFTP server, decrypt and stage data in Staging DB.
  • Assist the various stakeholders by providing them with process and program related data as requested, design layout of reports and determine the best way to present data to end users
  • Worked on various tasks and transformations like Execute Sql task, Execute Package Task, Conditional split, Script Component, Merge and Lookup while loading the data into Destination
  • Used ETL to implement slowly changing dimension transformations to maintain historical data in the data warehouse (see the sketch after this list).
  • Implemented different schema techniques such as Snowflake and Star schemas.
  • Performed Reverse Engineering of the current application using ERwin, and developed Logical and Physical data models for Central Model consolidation.
  • Involved in creating ER diagrams (physical and logical, using Erwin) and mapping the data into database objects.
  • Designed and created the Logical Data Model for the data mart
  • Good working knowledge of Meta-data management in consolidating metadata from disparate tools and sources including Data warehouse, ETL, Relational Databases and third-party metadata into a single repository to get information on data usage and end-to-end change impact analysis.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Performed metadata mapping as data was transformed from the operational environment to the data warehouse environment.
  • Worked on Agile Scrum Methodology and designed high level ETL architecture for overall data transfer from the OLTP to EDW with the help of SSIS.
  • Designed database, tables and views structure for the new data mart.
  • Successfully transferred inbound data from various sources like flat files, MS Access, and Excel into MS SQL Server 2012 using SSIS Packages.
  • Developed complex T-SQL code such as Stored Procedures, functions, triggers, Indexes and views for the application.
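
A minimal pandas sketch of Type 2 slowly changing dimension handling as referenced in the list above; the production work used SSIS/DataStage ETL jobs, and the customer_id key plus start_date/end_date/is_current columns are illustrative assumptions:

# Minimal sketch: expire current dimension rows and append new versions (SCD Type 2).
from datetime import date
import pandas as pd

def apply_scd_type2(dim: pd.DataFrame, changes: pd.DataFrame,
                    key: str = "customer_id") -> pd.DataFrame:
    today = pd.Timestamp(date.today())
    current = dim[dim["is_current"]]
    merged = changes.merge(current[[key]], on=key, how="left", indicator=True)
    # Close out the existing current row for every incoming key
    # (attribute-level comparison omitted for brevity).
    changed_keys = merged.loc[merged["_merge"] == "both", key]
    expire = dim[key].isin(changed_keys) & dim["is_current"]
    dim.loc[expire, "end_date"] = today
    dim.loc[expire, "is_current"] = False
    # Append every incoming row as the new current version.
    new_rows = changes.copy()
    new_rows["start_date"] = today
    new_rows["end_date"] = pd.NaT
    new_rows["is_current"] = True
    return pd.concat([dim, new_rows], ignore_index=True)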

Environment: ETL, SQL, Java, XHTML, CSS, JSP, Oracle 11g, Data Warehousing, Data Mart, Metadata, SQL Developer 8.0, Toad 9.5, SAS, HP Quality Center 10.0, UNIX, Windows XP, Oracle 10g and Teradata.

Confidential

Data Reporting Consultant

Responsibilities:

  • Worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
  • Have extensively worked in developing ETL for supporting Data Extraction, transformations and loading using Informatica Power Center
  • Extensively used Informatica PowerCenter 9.6.1 for ETL (Extraction, Transformation and Loading) of data from relational tables.
  • Walked through the Informatica and Teradata code to identify protected information references of columns like SSN, Medicaid number, Last name and first name.
  • Involved in upgrading Informatica Power Center 9.5.1 to 9.6.1.
  • Worked with Dimensional Data warehouses in Star and Snowflake Schemas, created slowly changing (SCD) Type1/2/3 dimension mappings using Ralph Kimball methodology
  • Extensive experience in database activities like Data Modeling, Database Design & Development, Coding, Implementation, Maintenance and Performance Monitoring & Tuning using tools such as Index Tuning Wizard, SQL Profiler and Replication.
  • Involved as ETL developer during analysis, planning, design, development, and implementation stages of multiple projects like UCUES (UC Undergraduate Experience Survey) and UC Path using IBM Web Sphere DataStage.
  • Developed multiple stored procedures with transaction processing for different modules.
  • Used ERwin to transform data requirements into data models.
  • Good working experience with Informatica data integration tools such as Repository Manager, Designer, Workflow Manager and Workflow Monitor, and scheduled workflows using Workflow Manager.
  • Performed metadata mapping as data was transformed from the operational environment to the data warehouse environment.
  • Expertise in SQL Server Analysis Services (SSAS) to deliver online analytical processing (OLAP) and data mining functionality for business intelligence applications.
  • Experience in data transformation, data mapping from source to target database schemas, data cleansing procedures.
  • Created SSIS packages to request a dump of weekly page views from Staples databases into our databases.
  • Developed SSIS packages not only for initial data loads but also for incremental/delta loads (see the sketch after this list).
  • Created log tables to capture SSIS events and used them to debug packages.
  • Created and scheduled an SSIS package to run our stored procedures to create the output report tables and regenerate reports automatically on a weekly or monthly basis using SSRS 2005.
  • Developed and supported the Extraction, Transformation and Load process (ETL) for a data warehouse from various data sources using DataStage Designer
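
A minimal sketch of a watermark-driven incremental load with an audit/log table, shown with Python and sqlite3 purely for illustration; the original packages were built in SSIS, and the source_orders/target_orders tables and modified_date column are hypothetical:

# Minimal sketch: load only rows newer than the target's high-water mark, and log the run.
import sqlite3
from datetime import datetime

def incremental_load(conn: sqlite3.Connection) -> None:
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS load_log "
                "(run_at TEXT, rows_loaded INTEGER, status TEXT)")
    conn.commit()
    # High-water mark: the latest modified_date already present in the target.
    cur.execute("SELECT COALESCE(MAX(modified_date), '1900-01-01') FROM target_orders")
    watermark = cur.fetchone()[0]
    try:
        cur.execute("INSERT INTO target_orders "
                    "SELECT * FROM source_orders WHERE modified_date > ?",
                    (watermark,))
        rows = cur.rowcount
        conn.commit()
        cur.execute("INSERT INTO load_log VALUES (?, ?, ?)",
                    (datetime.now().isoformat(), rows, "success"))
    except sqlite3.Error as exc:
        conn.rollback()
        cur.execute("INSERT INTO load_log VALUES (?, ?, ?)",
                    (datetime.now().isoformat(), 0, f"failed: {exc}"))
    conn.commit()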

Environment: ETL, Java, HTML, JavaScript, Erwin, SQL Server 2008, Data Warehousing, Data Mart, Metadata, UNIX, SQL Developer 8.0, Toad 9.5, HP Quality Center 10.0, SAS, Windows XP, Oracle 10g and Teradata.

Confidential

SQL Developer/Reporting Analyst

Responsibilities:

  • Interact with development team to identify data needs and potential issues as well as help design and implement SQL queries and reports
  • Participate in team meetings to plan, review and design solutions to business needs.
  • Provide ad hoc reports for department heads.
  • Promote and adhere to company database standards for security, coding, templates, connectivity and source control.
  • Expertise in programming complex SQL queries, SSRS, stored procedures, views, user defined functions and triggers.
  • Assist in maintaining multiple database and ancillary systems, conduct scheduled maintenance and monitor them for optimal performance.
  • Assist in diagnosing database issues.
  • Help support users of all databases by providing training as required.
  • Provide backup support for the Database Administrator and Report Developer.
  • Work with staff and clients to resolve any report issues; provide basic report filter customization.
