
Sr. Data Analyst/data Modeler Resume

Minneapolis, MN

SUMMARY:

  • 9+ years of strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration, MDM, NoSQL, Metadata Management Services and Configuration Management.
  • Worked with Technical Architects and Database analysts for the Design of Summary tables required for efficient Report Design.
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Experience with emerging technologies such as Big Data, Hadoop, and NoSQL.
  • Expert in Data Analysis, Design, Development, Implementation and Testing using Data Conversions, Extraction, Transformation and Loading (ETL) on SQL Server, Oracle and other relational and non-relational databases.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Focus on Organization, Client Relations and Process Improvements
  • Proficient in Requirements Gathering and Business Case Development through Rollout, Production and Maintenance, with a solid understanding of analyzing and documenting Business Processes.
  • Managed full SDLC processes involving requirements management, workflow analysis, source data analysis, data mapping, metadata management, data quality, testing strategy and maintenance of the model.
  • Understanding of the development of Conceptual, Logical and Physical Models for Online Transaction Processing and Online Analytical Processing (OLTP & OLAP).
  • Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 and Teradata.
  • Experience in logical/physical database design and review sessions to determine and describe data flow and data mapping from source to target databases coordinating with End Users, Business Analysts, DBAs and Application Architects.
  • Ability to learn technical information and keep self-updated with technology.
  • Expertise in scheduling JAD (Joint Application Development) with End Users, stake Holders, Subject Matter Experts, Developers and Testers.
  • Good exposure on usage of NoSQL database.
  • Experience in analyzing data using Hadoop Ecosystem including HDFS, Hive, Spark, Spark Streaming, Elastic Search, Kibana, Kafka, HBase, Zookeeper, PIG, Sqoop, Flume.
  • Experienced in designing and developing project document templates based on SDLC methodology.
  • Expert in Visio, Process Flow Diagrams, Activity Diagrams, Cross Functional Diagram, Swim Lane Diagrams, Use Case Diagrams.
  • Experienced in developing testing strategies and plans, writing test cases and performing UAT.
  • Collaborative and decisive with strong communication and interpersonal abilities.
  • Analytical and Customer-oriented with strategic planning and Quality leadership.
  • Flexible, enthusiastic and project oriented team player with excellent written, verbal communication and leadership skills to develop creative solutions for challenging client needs.
  • Quick learner and good performer both in team and independent job environments.

TECHNICAL SKILLS:

Database Tools: Microsoft SQL Server, Teradata, Oracle 12c/11g/9i, MS Access, and Netezza.

Programming Languages: T-SQL (SSIS, SSRS, SSAS), C, Data Structures in C

Big Data: PIG, Hive, HBase, Spark, Sqoop, Flume

Tools: OBIEE 10g/12c, EHP, GoToMeeting, DocuSign, InsideSales.com, SharePoint, MATLAB.

Design and Development: Visual Studio, Agile

Platforms: Mac OS X, Linux

Office: Google Forms, MS Project, MS Word, MS Excel (Formulas, Pivot Tables), MS PowerPoint, Outlook, Visio, MS Access

AWS: Redshift, EMR, EC2, S3, RDS, Cloud Search, Data Pipeline

Operating Systems: Windows XP, Linux, Unix.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Waterfall Model.

PROFESSIONAL EXPERIENCE:

Confidential, Minneapolis, MN

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Worked with Erwin to create Conceptual, Logical and Physical data models.
  • Performed Data Profiling and Data Quality checks and created various data quality rules.
  • Part of the team responsible for the analysis, design and implementation of the business solution.
  • Demonstrated strong analytical skills in identifying and resolving data exchange issues.
  • Executed SQL queries to retrieve data from databases for analysis.
  • Involved in creating Physical and Logical models using Erwin.
  • Worked on building the data model using Erwin as per the requirements. Designed the grain of facts depending on reporting requirements.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Collected large amounts of log data using Apache Flume and aggregated it using PIG/HIVE in HDFS for further analysis.
  • Involved with Data Analysis Primarily Identifying Data Sets, Source Data, Source Meta Data, Data Definitions and Data Formats.
  • Worked extensively on Physical, Logical and Conceptual data models.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL-like access on Hadoop data
  • Implementation of the full lifecycle of Data Warehouses and Business Data Marts with Star Schemas, Snowflake Schemas, SCDs and Dimensional Modeling.
  • Performed data analysis and data profiling using complex SQL on various sources systems including Oracle 10g/11g and Teradata.
  • Designed different types of STAR schemas, such as detailed data marts, plan data marts and monthly summary data marts, using ER Studio with various dimensions like Time, Services and Customers and various FACT tables.
  • Involved in preparing Logical Data Models/Physical Data Models.
  • Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
  • Worked with Teradata and/or Oracle in a multi-terabyte environment.
  • Designed and developed logical and physical data models using data warehouse methodologies, including Star schema and star-joined schemas, conformed dimensions data architecture and early/late binding techniques, and designed and developed ETL applications using Informatica.
  • Selected the appropriate AWS service based on data, compute, database, or security requirements.
  • Implemented a Big Data landscape with hands-on work in Hadoop (Hortonworks) and working knowledge of open source tools such as Hadoop, Sqoop and Hive.
  • Extensive hands-on technical knowledge of Data Warehouse Architecture, ETL, Database & SQL performance tuning
  • Prepared Data Modeling and Data Mapping documents for data elements and data cleansing that were used for the current and new systems, in MS Excel.
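
The data-profiling and data-quality-rule work listed above can be illustrated with a minimal Python sketch using sqlite3; all table and column names here are hypothetical, not from any client system:

```python
import sqlite3

def profile_table(conn, table, columns):
    """Basic data-profiling metrics: row count, null rate, distinct count."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    total = cur.fetchone()[0]
    stats = {}
    for col in columns:
        # COUNT(col) skips NULLs; COUNT(DISTINCT col) counts unique values
        cur.execute(f"SELECT COUNT({col}), COUNT(DISTINCT {col}) FROM {table}")
        non_null, distinct = cur.fetchone()
        stats[col] = {
            "null_rate": (total - non_null) / total if total else 0.0,
            "distinct": distinct,
        }
    return total, stats

# Toy source table standing in for a real feed
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "a@x.com"), (2, None), (3, "a@x.com")])
total, stats = profile_table(conn, "customers", ["id", "email"])
print(total, stats["email"])
```

Metrics like these feed directly into data quality rules, e.g. "null rate on email must be below 5%".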

Environment: ERWIN, Teradata, Oracle, DB2, SQL Server 2014, MS Visio, MS Outlook, MS Office Suite, MS Project, MS Excel, MS Word, Windows Server, UNIX, Sybase, Business Objects, PL/SQL

Confidential, Newark, NJ

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Generated DDL scripts using Forward Engineering technique to create objects and deploy them into the databases.
  • Reverse engineered existing database schemas in order to implement changes and updates for existing tables and views.
  • Experienced in database design activities such as maintaining sequences, indexes, primary keys and foreign keys, and manipulating columns and tables.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Developed SQL, BTEQ (Teradata) queries for Extracting data from production database and built data structures, reports.
  • Performed data modeling using TOAD Data Modeler: identified objects and relationships and how they fit together as logical entities, then translated these into a physical design using the tool's forward-engineering feature.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
  • Responsible for Dimensional Data Modeling and Modeling Diagrams using ERWIN.
  • Designed and implemented a Data Lake to consolidate data from multiple sources, using Hadoop stack technologies like SQOOP, HIVE/HQL.
  • Performed data cleaning and data manipulation activities using the NZSQL utility.
  • Excellent knowledge and experience in Technical Design and Documentation.
  • Designed and Developed Oracle PL/SQL Procedures and UNIX Shell Scripts for Data Import/Export and Data Conversions.
  • Installed and configured a 3-node cluster on AWS EC2 Linux servers.
  • Performed data cleaning and data manipulation activities using NoSQL utilities.
  • Developed Conceptual and Logical data models and transformed them into schemas using ERWIN.
  • Worked with Amazon Redshift and AWS
  • Worked with Hadoop eco system covering HDFS, HBase, YARN and Map Reduce
  • Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects XI R2.
  • Prepared in-depth data analysis reports weekly, biweekly and monthly using MS Excel, SQL and UNIX.
  • Performance Tuning (Database Tuning, SQL Tuning, Application/ETL Tuning)
  • Created and altered SQL statements before sending database change requests to the DBA team.
  • Maintained and documented all CREATE and ALTER SQL statements for all releases.
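
The forward-engineering step above (model to DDL) can be sketched in Python; the model fragment and table names are illustrative assumptions, and sqlite3 stands in for the target database:

```python
import sqlite3

# Hypothetical logical-model fragment: table -> {column: type/constraint}
model = {
    "dim_customer": {
        "customer_key": "INTEGER PRIMARY KEY",
        "customer_name": "TEXT NOT NULL",
    },
    "fact_sales": {
        "sale_id": "INTEGER PRIMARY KEY",
        "customer_key": "INTEGER REFERENCES dim_customer(customer_key)",
        "amount": "REAL NOT NULL",
    },
}

def forward_engineer(model):
    """Emit one CREATE TABLE statement per table in the model."""
    ddl = []
    for table, cols in model.items():
        body = ",\n  ".join(f"{name} {spec}" for name, spec in cols.items())
        ddl.append(f"CREATE TABLE {table} (\n  {body}\n);")
    return ddl

statements = forward_engineer(model)
conn = sqlite3.connect(":memory:")
for stmt in statements:
    conn.execute(stmt)   # deploy the generated DDL
tables = {r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
print(tables)
```

Reverse engineering is the inverse pass: reading an existing catalog (here, sqlite_master) back into the model.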

Environment: UNIX, Sybase, Business Objects, DB2, SQL Server, PL/SQL, ERWIN, Teradata, Oracle, MS Visio, MS Outlook, MS Office Suite, MS Project, MS Excel, MS Word, Windows Server

Confidential, Atlanta, GA

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Performed data analysis and profiling of source data to better understand the sources.
  • Mapped business needs/requirements to the subject area model and to the logical enterprise model.
  • Worked with DBAs to create a best-fit physical data model from the logical data model.
  • Experience working on creating models for Teradata master data management.
  • Involved in extensive Data Analysis on the Teradata and Oracle systems, querying and writing in SQL and TOAD.
  • Created the logical data model from the conceptual model and converted it into the physical database design using ERWIN.
  • Involved in Performance Tuning of the database, which included creating indexes, optimizing SQL statements and monitoring the server.
  • Developed the logical and physical data models that capture existing-condition/potential-status data fundamentals and data flows using ER-Studio.
  • Developed regression test scripts for the application, was involved in metrics gathering, analysis and reporting to the concerned team, and tested the test programs.
  • Involved in extensive DATA validation by writing several complex SQL queries and Involved in back-end testing and worked with data quality issues.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications by utilizing data management software programs and tools like Toad, MS Access, Excel and SQL.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes and candidate keys) and physical database (capacity planning, object creation and aggregation strategies) for Oracle and Teradata as per business requirements using Erwin 9.5.
  • Redefined many attributes and relationships in the model and cleansed unwanted tables/columns as part of Data Analysis responsibilities.
  • Assisted developers, the ETL and BI teams, and end users in understanding the data model.
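
The index-based performance tuning mentioned above can be demonstrated with a small sketch; sqlite3 stands in for the production database, and the table and index names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i % 100) for i in range(1000)])

def plan_for(conn, sql):
    # EXPLAIN QUERY PLAN reveals whether SQLite scans the table or uses an index
    return " ".join(str(row) for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM orders WHERE customer_id = 42"
before = plan_for(conn, query)   # no index yet: full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan_for(conn, query)    # optimizer now searches via the index
print(before)
print(after)
```

The same before/after plan comparison is how index candidates are validated during SQL tuning on any engine, though each database has its own EXPLAIN syntax.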

Environment: SQL Server 2005/2008, DB2, Microsoft Transaction Server, Erwin r8.2, Teradata 14, Crystal Reports, MS SQL Server Manager, Microsoft Access, Requisite Pro, MS Excel, MS Visio, Rational Rose, MS Office Suite, Agile, Oracle 10g/9i

Confidential, NYC NY

Data Analyst/Data Modeler

Responsibilities:

  • Performed Data Analysis, primarily identifying Data Sets, Source Data, Source Metadata, Data Definitions and Data Formats.
  • Identified the entities and relationship between the entities to develop Conceptual Model using ERWIN.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Netezza.
  • Developed ER and Dimensional Models using Power Designer advanced features. Created physical and logical data models using Power Designer.
  • Managed full SDLC processes involving requirements management, workflow analysis, source data analysis, data mapping, metadata management and data quality.
  • Performed Data Profiling to identify data issues upfront, and provided SQL prototypes to confirm the business logic provided prior to development.
  • Responsible for the development and maintenance of Logical and Physical data models, along with corresponding metadata, to support applications.
  • Resolved data type inconsistencies between the source systems and the target system using the Mapping Documents.
  • Experience in Data Transformation and Data Mapping from source to target database schemas, as well as data cleansing.
  • Analyzed business requirements, system requirements and data mapping requirement specifications with Power Designer, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets.
  • Analyzed the physical data model to understand the relationships between existing tables, and cleansed unwanted tables and columns as per the requirements as part of Data Analyst duties.
  • Worked on two sources to bring in the required data for a project's reporting by writing SQL extracts.
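
Resolving data type inconsistencies via a mapping document, as described above, can be sketched as follows; the column names and casts are a hypothetical mapping, not one from any actual source system:

```python
# Hypothetical source-to-target mapping document, as a dict:
# target column -> (source column, cast/cleanse function)
mapping = {
    "customer_id": ("CUST_NO", int),       # source sends numbers as text
    "balance":     ("ACCT_BAL", float),    # decimal stored as string
    "state":       ("ST_CD", str.strip),   # fixed-width field with padding
}

def apply_mapping(source_row, mapping):
    """Rename and cast each source field per the mapping document."""
    return {tgt: cast(source_row[src]) for tgt, (src, cast) in mapping.items()}

source_row = {"CUST_NO": "1001", "ACCT_BAL": "250.75", "ST_CD": " NJ "}
target_row = apply_mapping(source_row, mapping)
print(target_row)
```

In practice the same mapping document drives both the ETL transformations and the validation SQL that compares source and target.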

Environment: Teradata 13.x, Erwin 7.2, Unix, Agile, PL/SQL, MS Access, Oracle 10g/9i, SQL Server 2008/2005, MS Office, MS Visio, Crystal Reports, Quality Center 9.2, MS Excel 2007, Informatica.

Confidential

Data Analyst/Data Modeler

Responsibilities:

  • Analyzed the metric dashboard reports and identified the formulas and functionality of the dashboard reports and metric dashboards in SAP Business Objects.
  • Worked on the data mapping process from source system to target system. Created a dimensional model for the reporting system by identifying required facts and dimensions using Erwin.
  • Involved in migrating the data model from one database to a Teradata database and prepared a Teradata staging model.
  • Tested the database to check field size validation, check constraints and stored procedures, and cross-verified the field sizes defined within the application against the metadata.
  • Involved in Teradata SQL Development, Unit Testing and Performance Tuning, and ensured testing issues were resolved on the basis of defect reports.
  • Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, Data Warehouse and data mart reporting systems in accordance with requirements.
  • Identified and defined entities, relationships and attributes in the data model as per new specifications in Erwin after analyzing the database systems currently in use. Identified source elements and mapped them to the target systems.
  • Resolved data issues and updates for multiple applications using SQL queries/scripts.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment. Developed working documents to support findings and assign specific tasks.
  • Conducted Data Analysis and identified Data Quality issues using data profiling methodologies.
  • Involved with Data Analysis, primarily identifying Data Sets, Source Data, Source Metadata, Data Definitions and Data Formats.
  • Developed the required data warehouse model using a Star schema for the generalized model.
  • Created data masking mappings to mask sensitive data between production and test environments.
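
The data-masking work noted above can be illustrated with a small Python sketch; the sensitive column list and masking scheme are hypothetical stand-ins for a real masking mapping:

```python
import hashlib

# Columns flagged as sensitive in the (hypothetical) masking mapping
SENSITIVE = {"ssn", "email"}

def mask_value(value):
    """Deterministic mask: the same input always yields the same token,
    so joins between masked tables in the test environment still work."""
    digest = hashlib.sha256(value.encode()).hexdigest()
    return "MASK_" + digest[:8]

def mask_row(row):
    # Mask only sensitive columns; pass everything else through unchanged
    return {col: mask_value(val) if col in SENSITIVE else val
            for col, val in row.items()}

prod_row = {"id": 7, "email": "jane@example.com", "ssn": "123-45-6789"}
test_row = mask_row(prod_row)
print(test_row)
```

A hash-based token preserves referential integrity but is not reversible; format-preserving encryption would be used instead where the masked value must still look like, say, a valid SSN.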

Environment: Teradata, Linux, Oracle 9i/10g, PL/SQL, SQL Server 2005/2000, MS Access, Perl, Metadata, NZSQL, Erwin 6.0, Toad 7.6, Informatica 7.0, DB2 V7.1
