Sr. Data Modeler/ Data Analyst Resume
Washington, DC
SUMMARY
- Over 9 years of experience in data analysis and data modeling, and in the development, implementation and maintenance of databases and software applications.
- Expert in writing SQL queries and optimizing the queries in Oracle, SQL Server and Teradata.
- Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using ETL tools such as Ab Initio and Informatica PowerCenter.
- Experience in testing and writing SQL and PL/SQL statements - stored procedures, functions, triggers and packages.
- Excellent experience in creating cloud-based solutions and architectures using Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS) and Microsoft Azure.
- Excellent knowledge of creating reports in SAP BusinessObjects, including Webi reports for multiple data providers.
- Strong experience with different project methodologies including Agile Scrum Methodology and Waterfall SDLC methodology.
- Solid hands-on experience with administration of data model repositories and documentation in metadata portals using tools such as Erwin, ER Studio and PowerDesigner.
- Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
- Excellent experience using Teradata SQL Assistant, Teradata Administrator, PMON, and data load/export utilities such as BTEQ, FastLoad, MultiLoad and FastExport, with exposure to TPump, on UNIX/Windows environments, including running batch processes for Teradata.
- Experience in designing Star schema, Snowflake schema for Data Warehouse, ODS architecture.
- Experience in designing Conceptual, Logical and Physical data models to build the Data Warehouse.
- Specialization in Data Modeling, Data Warehouse design, conceptual architecture, Data Integration and Business Intelligence solutions.
- Experience working with Agile, Waterfall methodologies, Ralph Kimball and Bill Inmon approaches.
- Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
- Good knowledge of Normalization and De-Normalization techniques for optimum performance in relational and dimensional database environments.
- Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Experienced with Excel PivotTables and VBA macros for various business scenarios.
- Extensive experience in supporting Informatica applications, data extraction from heterogeneous sources using Informatica Power Center.
- Experience in automating and scheduling Informatica jobs using UNIX shell scripting and configuring cron jobs for Informatica sessions.
- Experience in designing error and exception handling procedures to identify, record and report errors.
- Excellent experience in writing and executing unit, system, integration and UAT scripts in data warehouse projects.
- Excellent experience in writing SQL queries to validate data movement between different layers in data warehouse environment.
- Excellent experience in troubleshooting test scripts, SQL queries, ETL jobs, data warehouse/data mart/data store models.
- Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
- Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects.
- Excellent at creating various project artifacts, including specification documents, data mapping and data analysis documents.
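Much of the validation work summarized above (writing SQL queries to validate data movement between warehouse layers) reduces to reconciliation queries. A minimal, hypothetical sketch using Python's sqlite3 as a stand-in for the warehouse; all table and column names are illustrative, not taken from any project described here:

```python
import sqlite3

# Illustrative staging and target tables standing in for two warehouse layers.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
""")

def reconcile(conn, src, tgt, key):
    """Row-count and missing-key reconciliation between a source and target layer."""
    src_count = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    # Keys present in the source but absent from the target.
    missing = conn.execute(
        f"SELECT {key} FROM {src} EXCEPT SELECT {key} FROM {tgt}"
    ).fetchall()
    return src_count, tgt_count, [row[0] for row in missing]

src_count, tgt_count, missing_keys = reconcile(conn, "stg_orders", "dw_orders", "order_id")
```

The same count-and-`EXCEPT` pattern applies whether the layers live in Oracle, Teradata or SQL Server; only the dialect details change.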
TECHNICAL SKILLS
Data Modeling Tools: Erwin Data Modeler 9.7/9.6, ER Studio v17, and Power Designer.
OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9
Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.
Big Data technologies: HBase 1.2, HDFS, Sqoop 1.4, Spark, Hadoop 3.0, Hive 2.3, EC2, S3 Bucket, AMI, RDS
Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.
Programming Languages: SQL, PL/SQL, HTML5, XML and VBA.
Cloud Platforms: AWS (EC2, S3, Redshift) and MS Azure
Operating System: Windows, Unix, Sun Solaris
Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model
ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, and Pentaho.
PROFESSIONAL EXPERIENCE
Confidential - Washington, DC
Sr. Data Modeler/ Data Analyst
Responsibilities:
- Interacted with Business Analysts, SMEs and other Data Architects to understand business needs and functionality for various project solutions.
- Worked as a Data Modeler/Analyst to generate Data Models using SAP PowerDesigner and developed relational database system.
- Worked on Software Development Life Cycle (SDLC) with good working knowledge of testing, Agile methodology, disciplines, tasks, resources and scheduling.
- Worked with business stakeholders to gather data quality requirements, action data profiling, and produce metrics to identify opportunities for improved testing and reporting.
- Developed full life cycle software, including defining requirements, prototyping, designing, coding, testing and maintaining software.
- Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
- Developed logical data models and physical database design and generated database schemas using SAP PowerDesigner.
- Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Involved in dimensional modeling (Star Schema) of the data warehouse and used PowerDesigner to design the business processes, dimensions and measured facts.
- Implemented Forward Engineering by using DDL scripts and generating indexing strategies.
- Reverse Engineered physical data models from SQL Scripts and databases.
- Worked with Data Analytics, Data Reporting, Ad-hoc Reporting, Graphs, Scales, PivotTables and OLAP reporting.
- Designed ER diagrams and mapped the data into database objects.
- Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
- Implemented data quality process including transliteration, parsing, analysis, standardization, and enrichment at point of entry and batch modes.
- Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture.
- Led the Data Governance process with external and internal data providers to ensure timely, accurate and complete data.
- Researched and developed hosting solutions using MS Azure for service solution.
- Documented data dictionaries and business requirements for key workflows and process points.
- Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Worked on PL/SQL collections, index-by tables, arrays, BULK COLLECT, FORALL, etc.
- Created and communicated the strategy and use of Big Data.
- Wrote test plans and test cases in compliance with organizational standards.
- Designed the data warehouse architecture for all the source systems using MS Visio.
- Responsible for different Data mapping activities from Source systems to Teradata.
- Developed and maintained an Enterprise Data Model (EDM) to serve as both the strategic and tactical planning vehicles to manage the enterprise data warehouse. This effort involves working closely with the business.
- Monitored data quality and ensured the integrity of data was maintained for effective functioning of the department.
- Worked with project management, business teams and departments to assess and refine requirements to design/develop BI solutions using MS Azure.
- Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
- Worked with MDM systems team with respect to technical aspects and generating reports.
- Worked with the data team to profile source data and determine source and metadata characteristics.
- Involved in Data profiling, Data analysis and data mapping artifacts design.
- Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
- Performed detailed data analysis to analyze the duration of claim processes and created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
- Generated various reports using SQL Server Report Services (SSRS) for business analysts and the management team.
- Designed OLTP system environment and maintained documentation of Metadata.
- Analyzed results from data validation queries to present to user groups.
- Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
- Testing responsibilities included unit testing, integration testing, and business acceptance testing.
- Applied appropriate level of abstraction in designs and confirmed that Data designs support the integration of data and information flow across systems and platforms.
Environment: SAP PowerDesigner 16.6, Agile, MDM, PL/SQL, SSAS, SSRS, ETL, OLTP, SQL scripts, Big Data, NoSQL, MS Visio, MS Azure.
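The SSAS cube work above (star schemas built from facts and dimensions to analyze claim durations) boils down to dimensional rollup queries over a fact table. A small sketch using sqlite3 in place of the analysis server; schema and measure names are hypothetical:

```python
import sqlite3

# Minimal star schema: one fact table keyed to a date dimension (names illustrative).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key       INTEGER PRIMARY KEY,
        calendar_date  TEXT,
        month          TEXT
    );
    CREATE TABLE fact_claims (
        claim_id        INTEGER PRIMARY KEY,
        date_key        INTEGER REFERENCES dim_date(date_key),
        processing_days INTEGER
    );
    INSERT INTO dim_date VALUES (20240101, '2024-01-01', '2024-01'),
                                (20240201, '2024-02-01', '2024-02');
    INSERT INTO fact_claims VALUES (1, 20240101, 5), (2, 20240101, 7), (3, 20240201, 4);
""")

# Cube-style rollup: average claim duration by month, the kind of
# measure-by-dimension aggregate an SSAS cube would expose.
rollup = conn.execute("""
    SELECT d.month, AVG(f.processing_days)
    FROM fact_claims f JOIN dim_date d USING (date_key)
    GROUP BY d.month ORDER BY d.month
""").fetchall()
```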
Confidential - Columbus, OH
Sr. Data Modeler/ Data Analyst
Responsibilities:
- Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
- Part of team conducting logical data analysis and data modeling JAD sessions, communicated data-related standards.
- Worked on NoSQL databases including Cassandra. Implemented multi-data center and multi-rack Cassandra cluster.
- Coordinated with Data Architects on AWS provisioning EC2 Infrastructure and deploying applications in Elastic load balancing.
- Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation.
- Translated logical data models into physical database models and generated DDLs for DBAs.
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
- Collected, analyzed and interpreted complex data for reporting and/or performance trend analysis.
- Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
- Extensively used ETL methodology to support data extraction, transformation and load processing in a complex DW using Informatica.
- Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
- Involved in writing T-SQL and working on SSIS, SSRS, SSAS, Data Cleansing, Data Scrubbing and Data Migration.
- Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of Data Analysis responsibilities.
- Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
- Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects XIR2.
- Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class words Standards Document.
- Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle and flat files.
- Wrote SQL scripts to test the mappings and developed a Business Requirements Traceability Matrix.
- Created SQL tables with referential integrity, constraints and developed queries using SQL, SQL*PLUS and PL/SQL.
- Performed GAP analysis of current state to desired state and document requirements to control the gaps identified.
- Developed batch programs in PL/SQL for OLTP processing and used Unix shell scripts run from crontab.
- Identified and recorded defects with the information required for issues to be reproduced by the development team.
- Worked on reporting requirements and was involved in generating reports for the data model using Crystal Reports.
- Trained junior data modelers on data architecture standards and data modeling standards.
- Created views and dashboards on the end client's data. Produced powerful dashboards telling the story behind the data in easy-to-understand formats such as pie, bar, geo, and line charts, viewed daily by senior management.
Environment: Erwin 9.0, PL/SQL, Business Objects XIR2, Informatica 8.6, Oracle 11g, Teradata R13, Teradata SQL Assistant 12.0, Flat Files
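The referential-integrity work above (tables with foreign-key constraints, plus back-end validation queries) is typically verified with an orphan-key check: child rows whose foreign key has no matching parent. A hypothetical sketch with sqlite3; table names are illustrative:

```python
import sqlite3

# Parent/child tables standing in for a constrained warehouse schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY);
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER);
    INSERT INTO customers VALUES (1), (2);
    INSERT INTO orders VALUES (10, 1), (11, 2), (12, 99);  -- 99 has no parent row
""")

# LEFT JOIN + IS NULL surfaces child rows violating referential integrity.
orphans = conn.execute("""
    SELECT o.order_id
    FROM orders o LEFT JOIN customers c ON o.customer_id = c.customer_id
    WHERE c.customer_id IS NULL
""").fetchall()
orphan_ids = [row[0] for row in orphans]
```

In a warehouse without enforced constraints, running this as a scheduled validation query catches load-order and sourcing issues early.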
Confidential - Eden Prairie, MN
Sr. Data Analyst/ Data Quality Analyst
Responsibilities:
- Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
- Reviewed the transformation specifications written by various work streams to propose additional DQ rules.
- Collaborated with a small team of skilled professionals to create a center of excellence in developing data governance and management.
- Extensively used Agile methodology as the organization standard to implement the data models.
- Communicated with all business functions to maintain a comprehensive understanding of data quality requirements.
- Validated existing Data Quality rules to ensure they met Data Governance requirements.
- Determined data quality requirements by studying business functions, gathering information, and evaluating output requirements and formats.
- Maintained referential integrity by introducing foreign keys and normalized the existing data structure; worked with the ETL team and provided source-to-target mappings to implement incremental, full and initial loads into the target data mart.
- Worked on normalization techniques. Normalized the data into 3rd Normal Form (3NF).
- Arranged various guiding sessions for Programmers, Engineers, System Analysts and others to clarify performance requirements, interfaces, project capabilities and limitations.
- Developed solutions for data quality issues and collaborated with the business and IT to implement those solutions.
- Wrote SQL queries (aggregates, conditional statements, sub-queries) and analyzed large data sets, resulting in problem resolutions and improvement recommendations.
- Analyzed and presented the gathered information in graphical format for the ease of business managers.
- Produced Source to target data mapping by developing the mapping spreadsheets.
- Established data quality KPIs and metrics and created the reporting necessary to measure and communicate status toward achievement of data quality targets.
- Executed the strategy for Enterprise Data functional, SOA, API and data quality testing.
- Implemented Hadoop-based test frameworks and Selenium-based test frameworks.
- Provided IT and business partners consultation on using the Hadoop-based testing platform effectively.
- Validated business data objects to ensure the accuracy and completeness of the database.
- Supported SalesForce.com maintenance with services such as periodic data cleansing and workflow.
- Implemented ETL techniques for Data Conversion, Data Extraction and Data Mapping for different processes as well as applications.
- Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class words Standards Document.
- Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
- Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Reviewed code and designs to ensure there were no errors in the systems and recommended required updates where needed.
- Identified and tracked the slowly changing dimensions (SCD I, II, III & Hybrid/6) and determined the hierarchies in dimensions.
- Generated various reports using SQL Server Report Services (SSRS) for business analysts and the management team.
- Designed OLTP system environment and maintained documentation of Metadata.
- Used Informatica and SAS to extract, transform and load source data from transaction systems.
Environment: IDQ 9.5, OLTP, ETL, DDL, DML, SQL, Data Mapping, Metadata, SAS, Informatica 9.5
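The data-quality KPI work above (per-rule metrics reported against targets) can be sketched as completeness and validity rules evaluated over a record set. A minimal, hypothetical example; the field names and allowed-value domain are invented for illustration:

```python
# Tiny data-quality rule sketch: completeness and validity metrics over records,
# the kind of per-rule KPI described above (field names illustrative).
records = [
    {"id": 1, "email": "a@example.com", "state": "OH"},
    {"id": 2, "email": None,            "state": "OH"},
    {"id": 3, "email": "c@example.com", "state": "XX"},  # invalid state code
]

def completeness(rows, field):
    """Fraction of rows where the field is populated (non-empty, non-null)."""
    return sum(1 for r in rows if r[field]) / len(rows)

def validity(rows, field, allowed):
    """Fraction of rows whose value falls in the allowed domain."""
    return sum(1 for r in rows if r[field] in allowed) / len(rows)

email_completeness = completeness(records, "email")
state_validity = validity(records, "state", {"OH", "MN", "NY"})
```

Tools like IDQ express the same idea declaratively; the score per rule is what feeds the KPI dashboard.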
Confidential - New York, NY
Sr. Data Modeler/ Data Analyst
Responsibilities:
- Worked with Business Analysts team in requirements gathering and in preparing functional specifications and translating them to technical specifications.
- Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
- Planned and defined system requirements to Use Case, Use Case Scenario and Use Case Narrative using the UML (Unified Modeling Language) methodologies.
- Gathered all the analysis report prototypes from business analysts belonging to different business units; participated in JAD sessions involving discussion of various reporting needs.
- Reverse engineered the existing data marts and identified the Data Elements (in the source systems), Dimensions, Facts and Measures required for reports.
- Conducted design discussions and meetings to arrive at the appropriate Data Warehouse design at the lowest level of grain for each of the dimensions involved.
- Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
- Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using Erwin Data Modeler.
- Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Validated and updated the appropriate LDM's to process mappings, screen designs, use cases, business object model, and system object model as they evolve and change.
- Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
- Ensured the feasibility of the logical and physical design models.
- Worked on snowflaking the dimensions to remove redundancy.
- Wrote PL/SQL statements, stored procedures and triggers in DB2 for extracting as well as writing data.
- Defined facts, dimensions and designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
- Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
- Developed Data mapping, Data Governance, Transformation and Cleansing rules for the Data Management involving OLTP, ODS and OLAP.
- Created data masking mappings to mask the sensitive data between production and test environment.
- Normalized the database based on the newly developed model to bring it into 3NF for the data warehouse.
- Used SQL tools like Teradata SQL Assistant and TOAD to run SQL queries and validate the data in warehouse.
- Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the ODBC driver and PostgreSQL database.
- Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using Erwin.
- Constructed complex SQL queries with sub-queries and inline views as per the functional needs in the Business Requirements Document (BRD).
- Worked with supporting business analysis and marketing campaign analytics with data mining, data processing, and investigation to answer complex business questions.
- Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
- Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
- Created Erwin reports in HTML and RTF format depending upon the requirement, published the data model in Model Mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
Environment: PL/SQL, Erwin 8.5, MS SQL 2008, HTML, OLTP, ODS, OLAP, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata SQL Assistant
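The data-masking mappings mentioned above (masking sensitive data between production and test environments) usually rely on deterministic, irreversible tokenization so that joins still work after masking. A hypothetical sketch; column names and the salt are invented for illustration:

```python
import hashlib

# Deterministic masking sketch for moving production data into test.
# NOTE: salt and column names are illustrative assumptions, not from any real system.
def mask_value(value: str, salt: str = "static-salt") -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

row = {"customer_name": "Jane Doe", "ssn": "123-45-6789", "city": "New York"}
masked = {
    k: mask_value(v) if k in {"customer_name", "ssn"} else v
    for k, v in row.items()
}
```

Because the token is deterministic, the same source value masks to the same token everywhere, preserving referential relationships across masked tables.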
Confidential
Data Analyst/Data Modeler
Responsibilities:
- Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
- Wrote PL/SQL statements, stored procedures and triggers in DB2 for extracting as well as writing data.
- Optimized the existing procedures and SQL statements for better performance, using EXPLAIN PLAN, HINTS and SQL TRACE to tune SQL queries.
- Developed interfaces able to connect to multiple databases such as SQL Server and Oracle.
- Assisted Kronos project team in SQL Server Reporting Services installation.
- Developed SQL Server database to replace existing Access databases.
- Attended and participated in information and requirements gathering sessions.
- Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
- Designed and created web applications to receive query string input from customers and facilitate entering the data into SQL Server databases.
- Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
- Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
- Converted physical database models from logical models, to build/generate DDL scripts.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Extensively used ETL to load data from DB2 and Oracle databases.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
- Extensively used ETL methodology to support data extraction, transformation and load processing in a complex EDW using Informatica.
- Worked and experienced on Star Schema, DB2 and IMS DB.
Environment: Oracle, PL/SQL, DB2, Erwin 7.0, Unix, Teradata SQL Assistant, Informatica, OLTP, OLAP, Data Marts, DQ Analyzer
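The tuning work above (using EXPLAIN PLAN to optimize SQL) follows the same loop regardless of engine: read the plan, add or adjust an index, confirm the plan changed. A hypothetical sketch using SQLite's EXPLAIN QUERY PLAN as an analogue of Oracle's EXPLAIN PLAN; table and index names are illustrative:

```python
import sqlite3

# Tuning sketch: read the engine's plan output to confirm an index is picked up,
# analogous to reading EXPLAIN PLAN output in Oracle (names illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (account_id INTEGER, region TEXT)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [(i, "EAST" if i % 2 else "WEST") for i in range(100)])

query = "SELECT * FROM accounts WHERE account_id = 42"
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_accounts_id ON accounts(account_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# The last column of each plan row is the human-readable step description;
# a full scan reads "SCAN ...", an index lookup names the index used.
uses_index = any("idx_accounts_id" in row[-1] for row in plan_after)
```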
