
Sr. Data Architect/Modeler Resume

Minneapolis, MN


  • 8+ years of experience in Data Architecture and Data Modeling, and in the development, implementation, and maintenance of databases and software applications.
  • Experience working with Azure Monitoring, Data Factory, Traffic Manager, Service Bus, and Key Vault.
  • Designed and developed Cloud Service projects and deployed them to Web Apps, PaaS, and IaaS.
  • Configured SQL Server Master Data Services (MDS) on Windows Azure IaaS.
  • Working experience with Azure Storage, SQL Azure, and various PaaS solutions including Web and Worker Roles and Azure Web Apps.
  • Experience in Dimensional Data Modeling, Star / Snowflake schema, FACT & Dimension tables.
  • Strong experience in Data Analysis, Data Migration, Transformation, Integration, Data Import, and Data Export through ETL tools.
  • Experience in integrating Salesforce and SQL Server using SQL Server Integration Services (SSIS).
  • Developed complex database objects such as stored procedures, functions, packages, and triggers using SQL and PL/SQL.
  • Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
  • Experienced in technical consulting and end-to-end delivery with architecture, data modeling, data governance, and design, development, and implementation of solutions.
  • Excellent experience in developing Stored Procedures, Triggers, Functions, Packages, Inner Joins & Outer Joins, views using TSQL/PLSQL.
  • Hands-on experience on Agile and Waterfall Software Development Methodologies.
  • Hands-on experience in architecting and data modeling for the AWS platform, including AWS Redshift, Oracle RDS, PostgreSQL RDS, and Aurora.
  • Extensive knowledge of Bill Inmon (Enterprise Data Warehouse) and Ralph Kimball (Data Marts) methodologies, Database Design Methodologies (Normalization and De-Normalization).
  • Expert in Conceptual, Logical and Physical Data Modeling for various platforms including Oracle, DB2, Teradata, PostgreSQL, SQL Server.
  • Hands-on experience as Procedural DBA using Oracle toolset (PL/SQL, SQL, Performance Tuning).
  • Experience in ETL techniques, analysis, and reporting, including working experience with tools such as Tableau, Informatica, and Ab Initio.
  • Hands-on experience writing and optimizing SQL queries in Oracle, SQL Server, DB2, Netezza, and Teradata.
  • Experience in Data Modeling / Analysis for Enterprise Data Warehouse (OLAP), Master and Reference Data Management (MDM), and OLTP systems, including MicroStrategy Desktop/Developer, Web, Architect, OLAP Services, Administrator, and Intelligence Server.
  • Experience in designing Star schema, Snowflake schema for Data Warehouse, ODS architecture.
  • Excellent technical and analytical skills with clear understanding of design goals of ER modeling for OLTP and dimension modeling for OLAP.
  • Logical and physical database design (tables, constraints, indexes, etc.) using Erwin, ER/Studio, TOAD Data Modeler, and SQL Modeler.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
  • Excellent experience in writing and executing unit, system, integration, and UAT scripts in data warehouse projects.
  • Excellent knowledge of creating reports in SAP BusinessObjects, including Web Intelligence (Webi) reports for multiple data providers.
  • Experience in data transformation and data mapping from source to target database schemas, as well as data cleansing.
  • Experience in Ralph Kimball and Bill Inmon approaches.
  • Experience in migrating data from Excel, DB2, Sybase, flat files, Teradata, Netezza, and Oracle to MS SQL Server using the BCP and DTS utilities, including extracting, transforming, and loading the data.
  • Experience in using Oracle, SQL*Plus, and SQL*Loader.
  • Experience in automating and scheduling Informatica jobs using UNIX shell scripting and configuring Korn shell jobs for Informatica sessions.
  • Knowledge and working experience on big data tools like Hadoop, Azure Data Lake, AWS Redshift.
  • Experience in Business Intelligence (BI) project development and implementation using the MicroStrategy product suite.
  • Expertise in Relational Data Modeling (3NF) and Dimensional Data Modeling.
  • Experience in developing MapReduce Programs using Apache Hadoop for analyzing the big data as per the requirement.
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Experience using U-SQL, which combines declarative SQL with C# programming.
  • Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
  • Strong background in data modeling tools including Erwin, IBM Data Architect, PowerDesigner, and MS Visio.
  • Experience with relational (3NF) and dimensional data architectures. Experience in leading cross-functional, culturally diverse teams to meet strategic, tactical and operational goals and objectives.
  • Extensive experience in Relational Data Modeling, Logical data model/Physical data models Designs, ER Diagrams, Forward and Reverse Engineering, Publishing ERWIN diagrams, analyzing data sources and creating interface documents.
  • Experience in working with business intelligence and data warehouse software, including SSAS, Pentaho, Cognos, OBIEE, Greenplum Database, Amazon Redshift and Azure Data Warehouse.
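The fact and dimension table structure referenced in the bullets above can be sketched minimally, with SQLite standing in for the warehouse platforms listed; all table and column names here are illustrative, not taken from an actual engagement:

```python
import sqlite3

# Minimal star-schema sketch: dimension tables keyed by surrogate keys,
# and a fact table holding measures plus foreign keys to each dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);
""")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97)")

# A typical star join: aggregate fact measures by dimension attributes.
rows = cur.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
""").fetchall()
print(rows)  # [(2024, 'Hardware', 29.97)]
```

A Snowflake schema differs only in that dimension attributes (e.g., product category) are further normalized into their own tables.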


Cloud Platforms: Amazon EC2, S3, Elasticsearch, Elastic Load Balancing.

OLAP Tools: Tableau 7, SAP BusinessObjects, SSAS, and Crystal Reports.

ETL/Data Warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, SAP Business Objects XIR3.1/XIR2, Web Intelligence, Talend, Pentaho.

Database Tools: Microsoft SQL Server 16.0, Teradata 16.0, Oracle 12c/11g and MS Access.

Version Tool: VSS, SVN, CVS.

Data Modeling Tools: Rational System Architect, IBM Infosphere Data Architect, Erwin 9.7, E/R Studio 17, Power Designer and Oracle SQL Developer.

Operating System: Windows, UNIX, Sun Solaris.

Packages: Microsoft Office 2016, Microsoft Project 2016, SAP, Microsoft Visio, and SharePoint Portal Server.

Programming Languages: SQL, PL/SQL, HTML5 and XML.

Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, RUP, RAD, and JAD.

Databases & Scripting: Oracle 12c, PL/SQL, UNIX Shell Scripting.

Big Data Technology: HBase 1.2, HDFS, Sqoop 1.4, Hadoop 3.0, Hive 2.3, PIG.

Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio & Visual SourceSafe.


Confidential, Minneapolis MN

Sr. Data Architect/Modeler


  • Involved in the development of the logical data model from the business requirements using PowerDesigner, and worked with the team on generating the physical data model.
  • Specified the overall Data Architecture for all areas and domains of the enterprise, including Data Acquisition, ODS, MDM, Data Warehouse, Data Provisioning, ETL, and BI.
  • Developed U-SQL Scripts for schematizing the data in Azure Data Lake Analytics.
  • Used Polybase for ETL/ELT process with Azure Data Warehouse to keep data in Blob Storage with almost no limitation on data volume.
  • Developed a number of complex Informatica Mappings, Mapplets, and Reusable Transformations to facilitate one-time, daily, monthly, and yearly loading of data.
  • Experience in working with Cloudera, Hortonworks, and Microsoft Azure HDINSIGHT Distributions.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Responsible for Big data initiatives and engagement including analysis, brain storming, POC, and architecture.
  • Working on data profiling and analysis to create test cases for new Architecture evaluation.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Interacted with Business Analysts, SMEs, and other Data Architects to understand business needs and functionality for various project solutions.
  • Worked on various Azure services like Compute (Web Roles, Worker Roles), Azure Websites, Caching, SQL Azure, NoSQL, USQLS, Storage, Network services, Data Factory, Azure Active Directory, API Management, Scheduling, Auto Scaling, and PowerShell Automation.
  • Developing full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Developed long term data warehouse roadmap and Architectures, Designs and builds the data warehouse framework per the roadmap.
  • Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and flat files, and loaded the data into target databases using the Talend OS ETL tool.
  • Interact with business community and gathered requirements based on changing needs. Incorporated identified factors into Talend jobs to build the Data Mart.
  • Developed complex Talend ETL jobs to migrate the data from flat files to the database.
  • Modeled new tables and added them to the existing data model using PowerDesigner as part of data modeling.
  • Heavily involved in the Data Architect/Modeler role, reviewing business requirements and composing source-to-target data mapping documents.
  • Worked on Informatica Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Extensively worked on Informatica tools such as Source Analyzer, Data Warehouse Designer, Transformation Designer, Mapplet Designer, and Mapping Designer to design, develop, and test complex mappings and mapplets to load data from external flat files and RDBMS.
  • Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL access to Hadoop data.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from NoSQL and a variety of portfolios.
  • Experience in managing multi-tenant Cassandra clusters on public cloud environment - Amazon Web Services (AWS)-EC2.
  • Implemented Apache Kafka to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSql, and Big Data technologies.
  • Worked with the ETL team to document the SSIS packages for data extraction to Warehouse environment for reporting purposes.
  • Documented ER Diagrams, Logical and Physical models, business process diagrams and process flow diagrams.
  • Used SSRS to create reports, customized Reports, on-demand reports, ad-hoc reports and involved in analyzing multi-dimensional reports in SSRS.
  • Presented the data scenarios via Erwin logical models and Excel mockups to visualize the data better.
  • Responsible for creating conceptual, logical, and physical data models of disparate information for report development.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models.
  • Developed the data mart for the base data in Star and Snowflake schemas, and was involved in developing the data warehouse for the database.
  • Conducted and participated in JAD sessions with the users, modelers, and developers for resolving issues.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Worked with various Teradata15 tools and utilities like Teradata Viewpoint, Multi Load, ARC, Teradata Administrator, BTEQ and other Teradata Utilities.
  • Created a framework in DataStage covering all tables.
  • Created DDL scripts using Erwin and source to target mappings to bring the data from source to the warehouse.
  • Proficient in SQL across a number of dialects, commonly MySQL, PostgreSQL, SQL Server, and Oracle.
  • Hands-on cloud computing using Microsoft Azure with various BI technologies.
  • Involved in creating Hive tables and loading and analyzing data using Hive queries; developed Hive queries to process the data and generate data cubes for visualization.
  • Developed Data Mapping, Data Governance, Transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Developed and configured the Informatica MDM hub to support the Master Data Management (MDM), Business Intelligence (BI), and Data Warehousing platforms to meet business needs.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Extensively used Erwin r9.7 for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
  • Extensively used agile methodology as the Organization Standard to implement the data Models.
  • Working on several ETL Ab Initio assignments to perform extract, transform and load data into Teradata and Oracle databases which had complex data models of Relational, Star and Snowflake schemas.
  • Working on SSIS development using a metadata-driven architecture.
  • Identified and streamlined complex queries that were causing iterations and affecting database and system performance.
  • Coordinated with the Business Analyst and prepared Logical and Physical Data-models as per the requirements involving multiple subject areas, domains and naming standards.
  • Created DataStage jobs (ETL processes) for constantly populating the data warehouse from different source systems like ODS and flat files, and scheduled them using DataStage for system integration testing.
  • Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
  • Worked on reverse engineering the data model from database instances and scripts.
  • Metadata integration across Enterprise Data dictionary, Catalogs, Data Models and Informatica establishing lineage and traceability.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Worked on AWS, architecting a solution to load data, create data models, and run BI on it.
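The source-to-target mapping and cleansing-rule work described above can be sketched as a small transform step; the column names and rules below are hypothetical examples, not the actual mapping documents:

```python
# Illustrative source-to-target mapping with simple cleansing rules
# (column names and rules are hypothetical, not from a real project).
SOURCE_TO_TARGET = {
    "cust_nm": ("customer_name", str.strip),          # trim whitespace
    "cust_st": ("state_code",    lambda v: v.upper()),  # standardize case
    "bal_amt": ("balance",       float),              # cast to numeric
}

def transform(source_row):
    """Apply the mapping document: rename columns and cleanse values."""
    target = {}
    for src_col, (tgt_col, cleanse) in SOURCE_TO_TARGET.items():
        target[tgt_col] = cleanse(source_row[src_col])
    return target

row = {"cust_nm": "  Acme Corp ", "cust_st": "mn", "bal_amt": "1200.50"}
print(transform(row))
# {'customer_name': 'Acme Corp', 'state_code': 'MN', 'balance': 1200.5}
```

In practice the same mapping lives in an Informatica or Talend job; the point is that each target column carries both a rename and a cleansing rule traceable to the mapping document.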

Environment: Erwin 9.7, HBase 1.2, NoSQL, OLTP, OLAP, Teradata 15, Sybase PowerDesigner 15, Netezza, SQL Architect, MySQL, Oracle 12c, AWS, Hive 2.3, HDFS, Informatica, Snowflake Schema, Star Schema, SQL, Hadoop, SSIS.

Confidential, New York, NY

Data Architect/Modeler


  • Scheduled sessions and batches on the Informatica Server using Informatica Server Manager/Workflow Manager.
  • Worked with pmcmd to interact with the Informatica Server from command mode and execute shell scripts.
  • Processed and transformed data by running U-SQL scripts on Azure.
  • Deployed the packages on staging and production. Monitored jobs and supported Azure Data Lake production environment.
  • Worked with Data Governance, Data Quality, data lineage, and Data Architects to design various models and processes.
  • Worked on AWS, architecting a solution to load data and create data models.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Involved in data profiling and data cleansing to make sure the data is accurate when transferred from OLTP to the Data Marts and Data Warehouse.
  • Worked on SQL Server concepts SSIS (SQL Server Integration Services), SSAS (Analysis Services) and SSRS (Reporting Services).
  • Generated DDL (Data Definition Language) scripts using ER Studio and assisted the DBA in the physical implementation of data models.
  • Extensively worked on creating the migration plan to Amazon Web Services (AWS).
  • Extracted large volumes of data from AWS and the Elasticsearch engine using SQL queries to create reports.
  • Completed enhancement for MDM (Master data management) and suggested the implementation for hybrid MDM (Master Data Management).
  • Exported data from HDFS environment into RDBMS using Sqoop for report generation and visualization purpose.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct Data Analysis.
  • Performed Data Analysis, Data Migration, and data profiling using complex SQL on various source systems including Oracle and Teradata.
  • Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
  • Generated periodic reports based on the statistical analysis of the data using SQL Server Reporting Services (SSRS).
  • Worked with BTEQ to submit SQL statements, import and export data, and generate reports in Teradata.
  • Developed the full life cycle of a Data Lake and Data Warehouse with big data technologies like Spark and Hadoop.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Responsible for all metadata relating to the EDW's overall data architecture, descriptions of data objects, access methods and security requirements.
  • Worked extensively on ER Studio for multiple Operations across Atlas Copco in both OLAP and OLTP applications.
  • Developed the logical data models and physical data models that capture current state/future state data elements and data flows using ER Studio.
  • Responsible for Data Architecture, Data Modeling, Data Integration, Data quality & Metadata management solution design and delivery for Enterprise EDW environment.
  • Worked very closely with Data Architects and the DBA team to implement data model changes in the database in all environments.
  • Participated in all phases of project including Requirement gathering, Architecture, Analysis, Design, Coding, Testing, Documentation and warranty period.
  • Involved in Relational and Dimensional Data modeling for creating Logical and Physical Design of Database and ER Diagrams with all related entities and relationship with each entity based on the rules provided by the business manager using ER Studio.
  • Designed and developed 3 tier web applications hosted in Azure.
  • Utilized Azure Service Bus and Web services to handle messaging from thousands of devices, enabling smart phones to interact with vehicle telemetry.
  • Created Talend jobs to copy files from one server to another and utilized Talend FTP components.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Involved in the development of the logical data model from the business requirements using PowerDesigner, and worked with the team on generating the physical data model.
  • Worked on normalization and de-normalization concepts and design methodologies like the Ralph Kimball and Bill Inmon data warehouse methodologies.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Used forward engineering to generate DDL from the Physical Data Model and handed it to the DBA.
  • Integrated Spotfire visualization into client's Salesforce environment.
  • Involved in normalization and de-normalization of existing tables for faster query retrieval.
  • Involved in Planning, Defining and Designing database using ER Studio on business requirement and provided documentation.
  • Reverse Engineered DB2 databases and then forward engineered them to Teradata using ER Studio.
  • Part of the team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
  • Used Agile Methodology of Data Warehouse development using Kanbanize.
  • Worked with the Business Analyst, QA team in their testing and DBA for requirements gathering, business analysis, testing and project coordination.
  • Worked with DBA group to create Best-Fit Physical Data Model from the Logical Data Model using Forward Engineering.
  • Worked with NoSQL databases like HBase in creating HBase tables to load large sets of semi-structured data coming from various sources.
  • Development of DataStage design concepts, execution, testing, and deployment on the client server.
  • Developed Linux shell scripts using the nzsql/nzload utilities to load data from flat files to the Netezza database.
  • Validated the data of reports by writing SQL queries in PL/SQL Developer against the ODS.
  • Utilized Azure SQL Database, Web API, Azure Active Directory, Data Factory, and Azure Websites.
  • Responsible for delivering and coordinating data-profiling, data-analysis, data-governance, data-models (conceptual, logical, physical), data-mapping, data-lineage and reference data management.
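The data-masking mappings mentioned above (protecting sensitive data between production and test) follow a common pattern that can be sketched as below; the fields and masking rules are illustrative assumptions, not the actual mappings:

```python
import hashlib

# Hypothetical masking rules for copying production data to test.
# Deterministic masking (same input -> same output) preserves joins
# and referential integrity across masked tables.
def mask_ssn(ssn: str) -> str:
    """Keep only the last four digits, a common partial-masking rule."""
    return "XXX-XX-" + ssn[-4:]

def mask_email(email: str) -> str:
    """Replace the address with a stable pseudonym derived by hashing."""
    digest = hashlib.sha256(email.encode()).hexdigest()[:8]
    return f"user_{digest}@example.com"

record = {"name": "Jane Doe", "ssn": "123-45-6789", "email": "jane@corp.com"}
masked = {**record,
          "ssn": mask_ssn(record["ssn"]),
          "email": mask_email(record["email"])}
print(masked["ssn"])  # XXX-XX-6789
```

Because the hash is deterministic, two occurrences of the same production email mask to the same test value, so lookups and joins still line up in the test environment.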

Environment: ER Studio, Amazon Redshift, AWS, OLTP, Teradata r15, Sqoop 1.4, Cassandra 3.11, MongoDB 3.6, HDFS, Linux, Shell scripts, NoSQL, SSIS, SSAS, HBase 1.2, MDM.

Confidential - Thousand Oaks, CA

Sr. Data Modeler/Analyst


  • Coordinated with the DBA in implementing database changes and updating data models with changes implemented in development, QA, and Production.
  • Created and executed test scripts, cases, and scenarios to determine optimal system performance according to specifications.
  • Worked Extensively with DBA and Reporting team for improving the Report Performance with the Use of appropriate indexes and Partitioning.
  • Designed and developed Oracle PL/SQL procedures and Linux/UNIX shell scripts for data import/export and data conversions.
  • Experienced with BI reporting in the design and development of queries, reports, workbooks, Business Explorer Analyzer, Query Builder, and web reporting.
  • Generated various reports using SQL Server Reporting Services (SSRS) for business analysts and the management team.
  • Automated and scheduled recurring reporting processes using UNIX shell scripting and Teradata utilities such as MultiLoad, BTEQ, and FastLoad.
  • Participated in all phases including Analysis, Design, Coding, Testing and Documentation.
  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Data Modeler/Analyst in Data Architecture Team and responsible for Conceptual, Logical and Physical model for Supply Chain Project.
  • Extensive experience in PL/SQL programming: stored procedures, functions, packages, and triggers.
  • Data modeling in Erwin; designed target data models for the enterprise data warehouse (Teradata).
  • Involved in Data Architecture, Data profiling, Data analysis, data mapping and Data architecture artifacts design.
  • Worked closely with Business analysts, data architects and various teams to understand the requirements and to translate them into appropriate database designs.
  • Involved in logical and physical database design & development, normalization, and data modeling using Erwin and SQL Server Enterprise Manager.
  • Prepared ETL technical Mapping Documents along with test cases for each Mapping for future developments to maintain Software Development Life Cycle (SDLC).
  • Designed OLTP system environment and maintained documentation of Metadata.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data and create data models.
  • Created a dimensional model for the reporting system by identifying the required dimensions and facts.
  • Used reverse engineering to connect to existing databases and create graphical representations (E-R diagrams).
  • Used the Erwin modeling tool to publish a data dictionary, review the model and dictionary with subject matter experts, and generate data definition language.
  • Created and maintained data model standards, including master data management (MDM), and was involved in extracting the data from various sources like Oracle, SQL Server, Teradata, and XML.
  • Worked with medical claim data in the Oracle database for Inpatient/Outpatient data validation, trend and comparative analysis.
  • Used load utilities (FastLoad & MultiLoad) with the mainframe interface to load the data into Teradata.
  • Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
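The data-dictionary publication and reverse-engineering bullets above can be sketched by reading a database's own catalog; SQLite's PRAGMA here stands in for the Oracle/Teradata metadata views an Erwin workflow would query (the table is invented for illustration):

```python
import sqlite3

# Reverse-engineer a simple data dictionary from an existing database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT NOT NULL)")

dictionary = []
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
for table in tables:
    # PRAGMA table_info yields (cid, name, type, notnull, default, pk).
    for cid, name, col_type, notnull, default, pk in conn.execute(
            f"PRAGMA table_info({table})"):
        dictionary.append({
            "table": table, "column": name, "type": col_type,
            "nullable": not notnull and not pk, "primary_key": bool(pk),
        })

for entry in dictionary:
    print(entry)
```

The resulting entries are what would be reviewed with subject matter experts and published as the dictionary; a real pipeline would add business definitions alongside the physical metadata.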

Environment: Erwin 9.0, AWS, Redshift, Oracle 11g, SQL Server 2010, Teradata 14, XML, OLTP, PL/SQL, Linux, UNIX, MLoad, BTEQ, UNIX shell scripting.

Confidential - Atlanta,GA

Sr. Data Analyst/Modeler


  • Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Planned and defined system requirements to Use Case, Use Case Scenario and Use Case Narrative using the UML (Unified Modeling Language) methodologies.
  • Worked with Business Analysts team in requirements gathering and in preparing functional specifications and translating them to technical specifications.
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Developed Data mapping, Data Governance, Transformation and Cleansing rules for the Data Management involving OLTP, ODS and OLAP.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Reverse engineered the existing data marts and identified the data elements (in the source systems), dimensions, facts, and measures required for reports.
  • Validated and updated the appropriate LDMs to process mappings, screen designs, use cases, the business object model, and the system object model as they evolve and change.
  • Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
  • Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
  • Ensured the feasibility of the logical and physical design models.
  • Worked on snowflaking the dimensions to remove redundancy.
  • Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
  • Defined facts and dimensions and designed the data marts using Ralph Kimball's Dimensional Data Mart modeling methodology with ER/Studio.
  • Normalized the database based on the new model developed to put them into the 3NF of the data warehouse.
  • Used SQL tools like Teradata SQL Assistant and TOAD to run SQL queries and validate the data in warehouse.
  • Gathered all the analysis report prototypes from business analysts belonging to different business units; participated in JAD sessions involving the discussion of various reporting needs.
  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Delivered dimensional data models using ER/Studio to bring the Employee and Facilities domain data into the Oracle data warehouse.
  • Performed analysis of the existing source systems (transaction database).
  • Involved in maintaining and updating Metadata Repository with details on the nature and use of applications/data transformations to facilitate impact analysis.
  • Created DDL scripts using ER Studio and source to target mappings to bring the data from source to the warehouse.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata.
  • Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
  • Constructed complex SQL queries with sub-queries and inline views as per the functional needs in the Business Requirements Document (BRD).
  • Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the ODBC driver and PostgreSQL database.
  • Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using ER Studio.
  • Supported business analysis and marketing campaign analytics with data mining, data processing, and investigation to answer complex business questions.
  • Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using ER Studio Data Modeler.
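The snowflaking work mentioned above amounts to normalizing a dimension: repeated attribute groups are split into their own table. A tiny sketch with made-up product/category data:

```python
# A denormalized product dimension: the category attributes repeat
# on every product row (sample data, invented for illustration).
denormalized = [
    {"product_id": 1, "product_name": "Widget",
     "category": "Hardware", "category_mgr": "Lee"},
    {"product_id": 2, "product_name": "Gadget",
     "category": "Hardware", "category_mgr": "Lee"},
]

# Snowflake step 1: extract the unique category rows into their own table.
categories = {}
for row in denormalized:
    categories.setdefault(row["category"],
                          {"category": row["category"],
                           "category_mgr": row["category_mgr"]})

# Snowflake step 2: keep only a foreign-key reference on the product rows.
products = [{"product_id": r["product_id"],
             "product_name": r["product_name"],
             "category": r["category"]} for r in denormalized]

print(len(categories))  # 1 -- the repeated category data is stored once
```

The trade-off is the classic star-versus-snowflake one: storage and update anomalies shrink, but queries now need an extra join to reach the category attributes.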

Environment: PL/SQL, ER Studio, MS SQL 2008, OLTP, ODS, OLAP, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata SQL Assistant.


Data Analyst/ Modeler


  • Involved in the complete SSIS life cycle: creating SSIS packages and building, deploying, and executing the packages in all environments (QA, Development, and Production).
  • Developed and programmed test scripts to identify and manage data inconsistencies and to test ETL processes.
  • Performed data analysis and data profiling using complex SQL on various sources systems.
  • Used the MS Excel, MS Access for data pulls and ad hoc reports for analysis.
  • Performance tuning and optimization of large databases for fast data access and reporting in MS Excel.
  • Created conceptual, logical and physical data models using best practices and company standards to ensure high data quality and reduced redundancy.
  • Developed complex T-SQL code such as stored procedures, functions, triggers, indexes, and views for the business application.
  • Involved in Business and Data analysis during requirements gathering.
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirement.
  • Created SSIS packages for migration of data to the MS SQL Server database from other databases and sources like flat files, MS Excel, Sybase, and CSV files.
  • Developed reports for users in different departments in the organization using SQL Server Reporting Services (SSRS).
  • Performed in-depth analysis of data and prepared weekly, biweekly, and monthly reports using MS Excel, MS Access, SQL, and UNIX.
  • Extracted data from existing data sources and performed ad-hoc queries using SQL and UNIX.
  • Actively participated in the design of data model like conceptual, logical models using Erwin.
  • Performed logical data modeling, physical data modeling (including reverse engineering) using the Erwin Data Modeling tool.
  • Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
  • The project involved production, test, and development administration and support for the client's existing DB2 UDB platform running DB2 UDB v9.1 and v8.2 on servers under various operating systems.
  • Performed Data analysis for the existing Data warehouse and changed the internal schema for performance.
  • Used MS Visio for business flow diagrams and defined the workflow.
  • Worked with and extracted data from various database sources like DB2, CSV, XML, and flat files into DataStage.
  • Analyzed user requirements & worked with data modelers to identify entities and relationship for data modeling.
  • Developed SQL queries for extracting data from the production database and built data structures and reports.
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Designed Star and Snowflake data models for the Enterprise Data Warehouse using Erwin.
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model.
  • Created pivot tables and charts using worksheet data and external resources; modified pivot tables, sorted items, grouped data, and refreshed and formatted pivot tables.
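The pivot-table summaries described above can be sketched in plain Python: rows grouped by one key, columns by another, with a summed measure. The sales figures below are invented for illustration:

```python
from collections import defaultdict

# Sample (region, quarter, amount) rows standing in for worksheet data.
sales = [
    ("East", "Q1", 100), ("East", "Q2", 150),
    ("West", "Q1", 200), ("East", "Q1", 50),
]

# Pivot: rows = region, columns = quarter, values = sum of amount.
pivot = defaultdict(lambda: defaultdict(int))
for region, quarter, amount in sales:
    pivot[region][quarter] += amount

print(dict(pivot["East"]))  # {'Q1': 150, 'Q2': 150}
```

This mirrors what Excel's PivotTable does with a value field set to Sum: duplicate (region, quarter) rows collapse into one aggregated cell.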

Environment: Erwin 8.0, MS Excel, MS Access, DataStage, Star Schema, OLTP, ODS, OLAP, UNIX, T-SQL, MS SQL 2008, SSIS, SSRS, stored procedures, functions, triggers, MS Visio.
