
Sr. Data Architect / Modeler Resume

Boca Raton, FL

SUMMARY:

  • 7+ years in Information Technology with expertise in Data Architecture and Data Modeling for Data Warehouse/Data Mart development, and in Data Analysis for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI) applications.
  • Experience in Data Modeling/Analysis for Enterprise Data Warehouse (OLAP), Master Data Management (MDM), and OLTP systems.
  • Experience with the MicroStrategy suite, including MicroStrategy Desktop/Developer, Web, Architect, OLAP Services, Administrator and Intelligence Server.
  • Excellent technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
  • Logical and physical database design (tables, constraints, indexes, etc.) using Erwin, ER/Studio, Toad Data Modeler and SQL Developer Data Modeler.
  • Strong experience in Data Analysis, Data Migration, Transformation, Integration, Data Import, and Data Export through ETL tools.
  • Hands-on experience with Agile and Waterfall software development methodologies.
  • Hands-on experience in architecting and data modeling for the AWS platform, including Amazon Redshift, Oracle RDS, PostgreSQL RDS and Aurora.
  • Expertise in configuring monitoring and alerting tools, such as AWS CloudWatch, according to requirements.
  • Extensive knowledge of the Bill Inmon (Enterprise Data Warehouse) and Ralph Kimball (Data Mart) methodologies and of database design methodologies (normalization and denormalization).
  • Expert in Conceptual, Logical and Physical Data Modeling for various platforms including Oracle, DB2, Teradata, PostgreSQL, SQL Server.
  • Experience in developing, supporting and maintaining ETL (Extract, Transform and Load) processes using Talend Integration Suite.
  • Hands-on experience as Procedural DBA using Oracle toolset (PL/SQL, SQL, Performance Tuning).
  • Experience in ETL techniques, analysis and reporting, including working experience with tools such as Tableau, Informatica and Ab Initio.
  • Worked on integration of Salesforce and SQL Server using SQL Server Integration Services (SSIS).
  • Experience in Developing Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Experienced in technical consulting and end-to-end delivery covering architecture, data modeling, data governance, and design, development and implementation of solutions.
  • Experience in Business Intelligence (BI) project development and implementation using the MicroStrategy product suite.
  • Expertise in relational data modeling (3NF) and dimensional data modeling.
  • Experience in developing MapReduce Programs using Apache Hadoop for analyzing the big data as per the requirement.
  • Extensive use of Talend ELT, database, data set, HBase, Hive, Pig, HDFS and Sqoop components.
  • Practical understanding of data modeling (dimensional and relational) concepts such as Star Schema modeling, Snowflake Schema modeling, and Fact and Dimension tables (see the star schema sketch after this list).
  • Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
  • Strong background in data modeling tools, including Erwin, IBM Data Architect, PowerDesigner and MS Visio.
  • Experience with relational (3NF) and dimensional data architectures. Experience in leading cross-functional, culturally diverse teams to meet strategic, tactical and operational goals and objectives.
  • Extensive experience in Relational Data Modeling, Logical data model/Physical data models Designs, ER Diagrams, Forward and Reverse Engineering, Publishing ERWIN diagrams, analyzing data sources and creating interface documents.
  • Experience in working with business intelligence and data warehouse software, including SSAS, Pentaho, Cognos, OBIEE, Greenplum Database, Amazon Redshift and Azure Data Warehouse.
  • Excellent experience in developing stored procedures, triggers, functions, packages, inner and outer joins, and views using T-SQL and PL/SQL.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
  • Excellent experience in writing and executing unit, system, integration and UAT scripts in data warehouse projects.
  • Excellent knowledge of creating reports in SAP BusinessObjects, including Web Intelligence (WebI) reports with multiple data providers.
  • Experience in data transformation and data mapping from source to target database schemas, as well as data cleansing.
  • Good understanding and hands on experience with AWS S3 and EC2.
  • Experience in migrating data from Excel, DB2, Sybase, flat files, Teradata, Netezza and Oracle to MS SQL Server using BCP and DTS utilities, and in extracting, transforming and loading data.
  • Experience in using Oracle, SQL*Plus, and SQL*Loader.
  • Experience in automating and scheduling Informatica jobs using UNIX shell scripting, configuring Korn shell jobs for Informatica sessions.
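
The star schema pattern referenced in the modeling bullets above, as a minimal illustrative sketch: one fact table at a declared grain, with foreign keys out to dimension tables. All table and column names are invented for the example; a snowflake variant would further normalize a dimension (e.g., split category out of the product dimension).

    -- Hypothetical product dimension (invented names).
    CREATE TABLE dim_product (
        product_key   INTEGER       NOT NULL PRIMARY KEY,  -- surrogate key
        product_code  VARCHAR(20)   NOT NULL,              -- natural/business key
        product_name  VARCHAR(100)  NOT NULL,
        category      VARCHAR(50)
    );

    -- Hypothetical date dimension keyed as YYYYMMDD.
    CREATE TABLE dim_date (
        date_key       INTEGER   NOT NULL PRIMARY KEY,
        calendar_date  DATE      NOT NULL,
        fiscal_year    SMALLINT  NOT NULL
    );

    -- Fact table at the grain of one row per product per day; the foreign
    -- keys radiating out to the dimensions form the "star".
    CREATE TABLE fact_sales (
        date_key      INTEGER        NOT NULL REFERENCES dim_date (date_key),
        product_key   INTEGER        NOT NULL REFERENCES dim_product (product_key),
        units_sold    INTEGER        NOT NULL,
        sales_amount  DECIMAL(12,2)  NOT NULL,
        PRIMARY KEY (date_key, product_key)
    );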

TECHNICAL SKILLS:

Data Modeling Tools: Rational System Architect, IBM InfoSphere Data Architect, Erwin 9.7, ER/Studio 17, PowerDesigner and Oracle SQL Developer Data Modeler.

ETL/Data Warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, SAP BusinessObjects XI R3.1/XI R2, Web Intelligence, Talend, Pentaho.

Big Data Technology: MapReduce, HBase 1.2, HDFS, Sqoop 1.4, Hadoop 3.0, Hive 2.3, Pig.

Database Tools: Microsoft SQL Server 2016, Teradata 16.0, Oracle 12c/11g and MS Access.

Cloud Platforms: AWS (Amazon EC2, S3, Elasticsearch, Elastic Load Balancing).

Version Tool: VSS, SVN, CVS.

OLAP Tools: Tableau 7, SAP BusinessObjects, SSAS, and Crystal Reports.

Operating System: Windows, UNIX, Sun Solaris.

Packages: Microsoft Office 2016, Microsoft Project 2016, SAP, Microsoft Visio, and SharePoint Portal Server.

Programming Languages: SQL, PL/SQL, UNIX Shell Scripting, HTML5 and XML.

Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, RUP, RAD and JAD.

WORK EXPERIENCE:

Confidential, Boca Raton, FL

Sr. Data Architect / Modeler

Responsibilities:

  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per that roadmap.
  • Heavily involved in the Data Architect/Modeler role, reviewing business requirements and composing source-to-target data mapping documents.
  • Architected database infrastructure, development and implementation, exercising technical leadership in the creation of database infrastructure including planning, procurement, installation, configuration, integration and implementation.
  • Specified the overall Data Architecture for all areas and domains of the enterprise, including Data Acquisition, ODS, MDM, Data Warehouse, Data Provisioning and ETL.
  • Worked on data profiling and analysis to create test cases for new architecture evaluation.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Involved in end-to-end implementation of the Big Data design.
  • Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL access to Hadoop data (see the HiveQL sketch after this list).
  • Installed and configured a multi-node cluster in the cloud using Amazon Web Services (AWS) EC2.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from NoSQL sources and a variety of portfolios.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Extensively used Erwin r9.7 for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
  • Extensively used Agile methodology, as the organization standard, to implement the data models.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirement.
  • Designed and coded ETL/Talend jobs to process data into target databases.
  • Worked on several ETL Ab Initio assignments to extract, transform and load data into Teradata and Oracle databases that had complex Relational, Star and Snowflake schema data models.
  • Worked on SSIS development using a metadata-driven architecture.
  • Developed advanced ANSI SQL queries to extract, manipulate and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Defined business-level (logical) metadata terms through interactions with project teams, business subject matter experts, and data analysis.
  • Identified and streamlined complex queries that were causing excessive iterations and affecting database and system performance.
  • Coordinated with the Business Analyst and prepared Logical and Physical Data-models as per the requirements involving multiple subject areas, domains and naming standards.
  • Created DataStage jobs (ETL processes) for continually populating the data warehouse from source systems such as the ODS and flat files, and scheduled them in DataStage for system integration testing.
  • Developed a data mart for the base data in Star and Snowflake schemas as part of developing the data warehouse for the database.
  • Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Worked on AWS, architecting a solution to load data, create data models and run BI on top of them.
  • Developed data mapping, data governance, transformation and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Designed and developed architecture for a data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Worked on AWS Glue with the Python and PySpark programming languages.
  • Worked with the ETL team to document the SSIS packages for data extraction to Warehouse environment for reporting purposes.
  • Used Talend for Big data Integration using Spark and Hadoop.
  • Documented ER Diagrams, Logical and Physical models, business process diagrams and process flow diagrams.
  • Used SSRS to create reports, customized reports, on-demand reports and ad-hoc reports, and analyzed multi-dimensional reports in SSRS.
  • Presented data scenarios via Erwin logical models and Excel mockups to better visualize the data.
  • Responsible for creating conceptual, logical and physical data models of disparate information for report development.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Conducted and participated in JAD sessions with the users, modelers, and developers for resolving issues.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Worked with various Teradata 15 tools and utilities such as Teradata Viewpoint, MultiLoad, ARC, Teradata Administrator and BTEQ.
  • Created a framework in DataStage covering all tables.
  • Created DDL scripts using Erwin and source to target mappings to bring the data from source to the warehouse.
  • Developed and configured the Informatica MDM Hub to support the Master Data Management (MDM), Business Intelligence (BI) and Data Warehousing platforms to meet business needs.
  • Proficient in SQL across a number of dialects, including MySQL, PostgreSQL, SQL Server and Oracle.
  • Reverse engineered data models from database instances and scripts.
  • Integrated metadata across the enterprise data dictionary, catalogs, data models and Informatica, establishing lineage and traceability.
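
A minimal HiveQL sketch of the HDFS-to-Hive pattern described above (providing SQL access over files already sitting in HDFS). The table, columns and HDFS path are assumptions for illustration only.

    -- Hypothetical external table over delimited files in HDFS; Hive
    -- reads the files in place, so no data is moved.
    CREATE EXTERNAL TABLE IF NOT EXISTS customer_staging (
        customer_id  BIGINT,
        name         STRING,
        state        STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/data/landing/customers';  -- assumed HDFS path

    -- SQL access to the Hadoop data through Hive.
    SELECT state, COUNT(*) AS customer_count
    FROM customer_staging
    GROUP BY state;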

Environment: Erwin 9.7, HBase 1.2, NoSQL, OLTP, OLAP, Teradata 15, Netezza, SQL Architect, MySQL, Oracle 12c, AWS, Hive 2.3, HDFS, Informatica, Snowflake, Star Schema, SQL, Hadoop, SSIS.

Confidential, Minneapolis, MN

Sr. Data Analyst / Modeler

Responsibilities:

  • Responsible for Data Analysis, Data Modeling, Data Integration, Data quality & Metadata management solution design and delivery for Enterprise EDW environment.
  • Worked very closely with the Data Architecture and DBA teams to implement data model changes in the database across all environments.
  • Involved in relational and dimensional data modeling to create logical and physical database designs and ER diagrams, with all related entities and relationships for each entity, based on the rules provided by the business manager, using ER/Studio.
  • Involved in extensive data validation using ANSI SQL queries and back-end testing.
  • Worked on normalization and denormalization concepts and design methodologies, including Ralph Kimball's and Bill Inmon's data warehouse methodologies.
  • Worked on DataStage admin activities such as creating ODBC connections to various data sources, server startup and shutdown, creating environment variables, and creating DataStage projects.
  • Participated in all phases of project including Requirement gathering, Architecture, Analysis, Design, Coding, Testing, Documentation and warranty period.
  • Responsible for delivering and coordinating data-profiling, data-analysis, data-governance, data-models (conceptual, logical, physical), data-mapping, data-lineage and data management.
  • Designed both 3NF data models for ODS, OLTP systems and dimensional data models using Star and Snow Flake Schemas.
  • Generated metadata and created Talend ETL jobs and mappings to load the data warehouse and data lake.
  • Used forward engineering to generate DDL from the physical data model and handed it to the DBA.
  • Involved in normalization and denormalization of existing tables for faster query retrieval.
  • Involved in planning, defining and designing the database using ER/Studio based on business requirements, and provided documentation.
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Developed the full life cycle of a data lake and data warehouse with Big Data technologies such as Spark and Hadoop.
  • Created data masking mappings to mask sensitive data between the production and test environments (see the masking sketch after this list).
  • Responsible for all metadata relating to the EDW's overall data architecture, descriptions of data objects, access methods and security requirements.
  • Validated and reviewed solutions and effort estimates for the data center migration to the Azure cloud.
  • Transitioned new technical projects, ensuring a smooth go-live to Azure operations.
  • Conducted walkthroughs with the Business Analysts, Development teams and DBA to convey the changes made to the data models.
  • Used an Agile methodology for data warehouse development, managed in Kanbanize.
  • Worked with DBA group to create Best-Fit Physical Data Model from the Logical Data Model using Forward Engineering.
  • Worked with NoSQL databases like HBase in creating HBase tables to load large sets of semi-structured data coming from various sources.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Involved in data profiling and data cleansing, making sure the data is accurate and analyzed as it is transferred from OLTP to the data marts and data warehouse.
  • Worked on SQL Server concepts SSIS (SQL Server Integration Services), SSAS (Analysis Services) and SSRS (Reporting Services).
  • Generated DDL (Data Definition Language) scripts using ER/Studio and assisted the DBA in the physical implementation of data models.
  • Conducted a proof of concept for the latest Azure cloud-based services.
  • Completed enhancements for MDM (Master Data Management) and suggested the implementation of a hybrid MDM.
  • Exported data from HDFS environment into RDBMS using Sqoop for report generation and visualization purpose.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct Data Analysis.
  • Performed data analysis, data migration and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
  • Developed Linux Shell scripts by using Nzsql/Nzload utilities to load data from flat files to Netezza database.
  • Validated the data of reports by writing SQL queries in PL/SQL Developer against ODS.
  • Involved in user sessions and assisting in UAT (User Acceptance Testing).
  • Worked with data governance, data quality, data lineage and data architects to design various models and processes.
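
A minimal SQL sketch of the data masking idea mentioned above: overwriting sensitive columns while refreshing a test table from production. The schemas, table and columns are hypothetical; in practice the masking was built as mappings rather than hand-written SQL.

    -- Hypothetical refresh of test from production with masking applied
    -- in-flight (all names invented for the example).
    INSERT INTO test_db.customer (customer_id, full_name, ssn, email)
    SELECT customer_id,
           full_name,
           'XXX-XX-' || SUBSTR(ssn, 8, 4),                   -- keep last 4 digits only
           'user' || TRIM(CAST(customer_id AS VARCHAR(18)))
                  || '@example.com'                          -- synthetic address
    FROM prod_db.customer;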

Environment: ER Studio, Azure, OLTP, Teradata r15, Sqoop 1.4, Cassandra 3.11, MongoDB 3.6, HDFS, Linux, Shell scripts, NoSQL, SSIS, SSAS, HBase 1.2, MDM.

Confidential, Newark, NJ

Sr. Data Analyst Modeler

Responsibilities:

  • Worked as a Data Modeler/Analyst in Data Architecture Team and responsible for Conceptual, Logical and Physical model for Supply Chain Project.
  • Involved in logical and physical database design and development, normalization and data modeling using Erwin and SQL Server Enterprise Manager.
  • Involved in Data Architecture, Data profiling, Data analysis, data mapping and Data architecture artifacts design.
  • Worked closely with Business analysts, data architects and various teams to understand the requirements and to translate them into appropriate database designs.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data and create data models.
  • Prepared ETL technical Mapping Documents along with test cases for each Mapping for future developments to maintain Software Development Life Cycle (SDLC).
  • Extensively worked on creating the migration plan to Amazon web services (AWS).
  • Extracted large volumes of data from Amazon Redshift, AWS and the Elasticsearch engine using SQL queries to create reports.
  • Designed OLTP system environment and maintained documentation of Metadata.
  • Created a dimensional model for the reporting system by identifying the required dimensions and facts using Erwin.
  • Used reverse engineering to connect to existing databases and create graphical representations (E-R diagrams).
  • Used the Erwin modeling tool for publishing a data dictionary, reviewing the model and dictionary with subject matter experts, and generating data definition language.
  • Coordinated with DBA in implementing the Database changes and also updating Data Models with changes implemented in development, QA and Production.
  • Created and executed test scripts, cases and scenarios to determine optimal system performance according to specifications.
  • Developed Talend mappings using various transformations, sessions and workflows; Teradata was the target database, and the sources were a combination of flat files, Oracle tables, Excel files and a Teradata database.
  • Worked extensively with the DBA and reporting teams to improve report performance through the use of appropriate indexes and partitioning.
  • Extensive experience in PL/SQL programming: stored procedures, functions, packages and triggers (see the PL/SQL sketch after this list).
  • Performed data modeling in Erwin; designed target data models for the enterprise data warehouse (Teradata).
  • Designed and developed Oracle PL/SQL procedures and Linux and UNIX shell scripts for data import/export and data conversions.
  • Designed ETL process using Talend Tool to load from Sources to Targets through data Transformations.
  • Automated and scheduled recurring reporting processes using UNIX shell scripting and Teradata utilities such as MultiLoad, BTEQ and FastLoad.
  • Participated in all phases including Analysis, Design, Coding, Testing and Documentation.
  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Created and maintained data model standards, including master data management (MDM), and was involved in extracting data from various sources such as Oracle, SQL Server, Teradata and XML.
  • Worked with medical claim data in the Oracle database for Inpatient/Outpatient data validation, trend and comparative analysis.
  • Used Load utilities (Fast Load & Multi Load) with the mainframe interface to load the data into Teradata.
  • Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
  • Experienced with BI Reporting in Design and Development of Queries, Reports, Workbooks, Business Explorer Analyzer, Query Builder, Web Reporting.
  • Generated various reports using SQL Server Reporting Services (SSRS) for business analysts and the management team.
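
A minimal PL/SQL sketch of the stored-procedure work listed above: an upsert with basic error handling. The procedure, table and columns are invented for illustration.

    -- Hypothetical procedure: record the status of an ETL job run.
    CREATE OR REPLACE PROCEDURE set_load_status (
        p_job_name  IN VARCHAR2,
        p_status    IN VARCHAR2
    ) AS
    BEGIN
        -- Try to update an existing row first.
        UPDATE etl_load_status
           SET status     = p_status,
               updated_at = SYSDATE
         WHERE job_name = p_job_name;

        -- No row updated: insert a new one.
        IF SQL%ROWCOUNT = 0 THEN
            INSERT INTO etl_load_status (job_name, status, updated_at)
            VALUES (p_job_name, p_status, SYSDATE);
        END IF;

        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;  -- surface the error to the calling job
    END set_load_status;
    /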

Environment: Erwin 9.0, Oracle 11g, SQL Server 2010, Teradata 14, XML, OLTP, PL/SQL, Linux, UNIX, MultiLoad, BTEQ, UNIX shell scripting.

Confidential, New York City, NY

Data Analyst/Modeler

Responsibilities:

  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Conducted design discussions and meetings to settle on the appropriate data warehouse design at the lowest level of grain for each of the dimensions involved.
  • Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
  • Constructed complex SQL queries with sub-queries and inline views per the functional needs in the Business Requirements Document (BRD), as in the sketch after this list.
  • Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the ODBC driver and PostgreSQL database.
  • Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using ER Studio.
  • Used Talend to Extract, Transform and Load data into Netezza Data Warehouse from various sources like Oracle and flat files.
  • Worked with supporting business analysis and marketing campaign analytics with data mining, data processing, and investigation to answer complex business questions.
  • Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using ER Studio Data Modeler.
  • Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Planned and defined system requirements to Use Case, Use Case Scenario and Use Case Narrative using the UML (Unified Modeling Language) methodologies.
  • Worked with Business Analysts team in requirements gathering and in preparing functional specifications and translating them to technical specifications.
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Developed Data mapping, Data Governance, Transformation and Cleansing rules for the Data Management involving OLTP, ODS and OLAP.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Normalized the database based on the newly developed model to bring the tables into 3NF for the data warehouse.
  • Used SQL tools like Teradata SQL Assistant and TOAD to run SQL queries and validate the data in warehouse.
  • Gathered all the analysis report prototypes from business analysts belonging to different business units, and participated in JAD sessions discussing various reporting needs.
  • Reverse engineered the existing data marts and identified the data elements (in the source systems), dimensions, facts and measures required for the reports.
  • Validated and updated the appropriate LDMs to reflect process mappings, screen designs, use cases, business object models and system object models as they evolved and changed.
  • Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
  • Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
  • Ensured the feasibility of the logical and physical design models.
  • Worked on snowflaking the dimensions to remove redundancy.
  • Wrote PL/SQL statement, stored procedures and Triggers in DB2 for extracting as well as writing data.
  • Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in ER/Studio.
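
A minimal sketch of the sub-query/inline-view style referenced above: customers whose total spend exceeds the average customer total. Table and column names are hypothetical.

    -- Inline view t aggregates orders per customer; the sub-query in the
    -- WHERE clause computes the average of those per-customer totals.
    SELECT c.customer_id,
           c.customer_name,
           t.total_spend
    FROM customers c
    JOIN (SELECT customer_id,
                 SUM(order_amount) AS total_spend
          FROM orders
          GROUP BY customer_id) t
      ON t.customer_id = c.customer_id
    WHERE t.total_spend > (SELECT AVG(s.total_spend)
                           FROM (SELECT SUM(order_amount) AS total_spend
                                 FROM orders
                                 GROUP BY customer_id) s);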

Environment: PL/SQL, ER Studio, MS SQL 2008, OLTP, ODS, OLAP, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata SQL Assistant.

Confidential

Data Analyst

Responsibilities:

  • Involved in Business and Data analysis during requirements gathering.
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Created conceptual, logical and physical data models using best practices and company standards to ensure high data quality and reduced redundancy.
  • Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirement.
  • Wrote PL/SQL statement, stored procedures and Triggers in DB2 for extracting as well as writing data.
  • The project involved production, test and development administration and support for the client's existing DB2 UDB platform, running DB2 UDB v9.1 and v8.2 on servers under various operating systems.
  • Performed Data analysis for the existing Data warehouse and changed the internal schema for performance.
  • Used MS Visio for business flow diagrams and defined the workflow.
  • Worked with and extracted data from various sources such as DB2, CSV, XML and flat files into DataStage.
  • Analyzed user requirements & worked with data modelers to identify entities and relationship for data modeling.
  • Developed SQL queries for extracting data from the production database, and built data structures and reports.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model.
  • Developed complex T-SQL code such as stored procedures, functions, triggers, indexes and views for the business application.
  • Involved in the complete SSIS life cycle: creating SSIS packages, and building, deploying and executing the packages in all environments (QA, Development and Production).
  • Created SSIS packages for migrating data to the MS SQL Server database from other databases and sources such as flat files, MS Excel, Sybase and CSV files.
  • Developed reports for users in different departments in the organization using SQL Server Reporting Services (SSRS).
  • Performed in-depth data analysis and prepared weekly, biweekly and monthly reports using SQL, SAS, MS Excel, MS Access and UNIX.
  • Extracted data from existing data source and performed ad-hoc queries by using SQL and UNIX.
  • Actively participated in the design of conceptual and logical data models using Erwin.
  • Performed logical data modeling, physical data modeling (including reverse engineering) using the Erwin Data Modeling tool.
  • Used and supported database applications and tools for the extraction, transformation and analysis of raw data.
  • Understood business processes, data entities, data producers and data dependencies.
  • Developed and programmed test scripts to identify and manage data inconsistencies and to test ETL processes.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Used MS Excel and MS Access for data pulls and ad-hoc reports for analysis.
  • Performed performance tuning and optimization of large databases for fast data access and reporting in MS Excel.
  • Created pivot tables and charts using worksheet data and external resources; modified pivot tables, sorted items, grouped data, and refreshed and formatted pivot tables.
  • Performed data analysis and data profiling using complex SQL on various source systems.
  • Created SQL scripts to find data quality issues and to identify keys, data anomalies and data validation issues (see the sketch after this list).
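
Minimal examples of the kind of data quality SQL described in the last bullet; the staging tables and columns are invented for illustration.

    -- 1. Key check: business keys that occur more than once in staging.
    SELECT customer_id, COUNT(*) AS occurrences
    FROM customer_stage
    GROUP BY customer_id
    HAVING COUNT(*) > 1;

    -- 2. Anomaly check: required fields missing or clearly invalid.
    SELECT COUNT(*) AS bad_rows
    FROM customer_stage
    WHERE customer_id IS NULL
       OR birth_date > CURRENT_DATE;

    -- 3. Validation check: orphaned foreign keys before loading the mart.
    SELECT o.order_id
    FROM order_stage o
    LEFT JOIN customer_stage c
      ON c.customer_id = o.customer_id
    WHERE c.customer_id IS NULL;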

Environment: Erwin 8.0, MS Excel, MS Access, UNIX, T-SQL, MS SQL 2008, SSIS, SSRS, MS Visio.
