
Sr. Data Modeler/ Data Analyst Resume

Edison, NJ

SUMMARY

  • Around 9 years of experience in Data Modeling/Data Analysis including Data Development, Implementation and Maintenance of databases and software applications.
  • Experience working with Agile and Waterfall methodologies, and with the Ralph Kimball and Bill Inmon data warehousing approaches.
  • Experience with the Big Data Hadoop ecosystem for ingestion, storage, querying, processing and analysis of big data.
  • Good knowledge of Normalization and De-normalization techniques for optimum performance in relational and dimensional database environments.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Performed data analysis and data profiling using complex SQL on various source systems including Oracle and Teradata.
  • Specialization in Data Modeling, Data Warehouse design, building conceptual architectures, Data Integration and Business Intelligence solutions.
  • Expert in writing SQL queries and optimizing the queries in Oracle, SQL Server and Teradata.
  • Experience in designing Star schema, Snowflake schema for Data Warehouse, ODS architecture.
  • Experience in designing Logical, Physical and Conceptual data models to build the Data Warehouse.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio and Informatica Power Center.
  • Experience in testing and writing SQL and PL/SQL statements - Stored Procedures, Functions, Triggers and Packages.
  • Excellent experience using Teradata SQL Assistant, Teradata Administrator, PMON and data load/export utilities such as BTEQ, FastLoad, MultiLoad and FastExport, with exposure to TPump, on UNIX/Windows environments, and in running batch processes for Teradata.
  • Extensive experience in supporting Informatica applications, data extraction from heterogeneous sources using Informatica Power Center.
  • Experience in automating and scheduling Informatica jobs using UNIX Korn shell scripting and configuring cron jobs for Informatica sessions.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
  • Solid hands-on experience with administration of data model repositories and documentation in metadata portals such as Erwin, ER Studio and Power Designer.
  • Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
  • Experienced in database design for development and production environments involving Oracle, SQL Server, Netezza, MySQL, DB2, MS Access and Teradata.
  • Experienced working with Excel Pivot and VBA macros for various business scenarios.
  • Software Development Life Cycle (SDLC) experience including requirements, specification analysis/design and testing.
  • Solid experience in creating cloud-based solutions and architectures using Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS) and Microsoft Azure.
  • Excellent knowledge of creating reports in SAP Business Objects, including Webi reports for multiple data providers.
  • Excellent experience in writing and executing unit, system, integration and UAT scripts in data warehouse projects.
  • Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment (a representative validation query is sketched after this summary).
  • Excellent experience in troubleshooting test scripts, SQL queries, ETL jobs, data warehouse/data mart/data store models.
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
  • Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects.
  • Excellent at creating project artifacts including specification documents, data mapping and data analysis documents.
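
A representative sketch of the layer-to-layer validation queries referenced above, assuming an Oracle-style SQL dialect; the schema and table names (stg.customer, dw.dim_customer) are hypothetical.

```sql
-- Hypothetical reconciliation between a staging layer and the warehouse layer:
-- compare row counts, then surface keys loaded to staging but missing downstream.
SELECT (SELECT COUNT(*) FROM stg.customer)    AS stg_rows,
       (SELECT COUNT(*) FROM dw.dim_customer) AS dw_rows
FROM   dual;

SELECT s.customer_id
FROM   stg.customer s
LEFT JOIN dw.dim_customer d
       ON d.customer_id = s.customer_id
WHERE  d.customer_id IS NULL;  -- present in staging, not yet in the warehouse
```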

TECHNICAL SKILLS

Data Modeling Tools: Erwin Data Modeler 9.7/9.6, Erwin Model Manager, ER Studio v17, and Power Designer.

Programming Languages: SQL, PL/SQL, HTML5, XML and VBA.

Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.

Big Data technologies: HBase 1.2, HDFS, Sqoop 1.4, Spark, Hadoop 3.0, Hive 2.3, EC2, S3 Bucket, AMI, RDS

Cloud Platforms: AWS (EC2, S3, Redshift) & MS Azure

OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Operating System: Windows, Unix, Sun Solaris

ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, and Pentaho.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model

PROFESSIONAL EXPERIENCE

Confidential - Edison, NJ

Sr. Data Modeler/ Data Analyst

Responsibilities:

  • As a Data Modeler/Data Analyst, responsible for all data-related aspects of the project.
  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Worked as a Data Modeler/Analyst to generate Data Models using SAP PowerDesigner and developed relational database system.
  • Worked on Software Development Life Cycle (SDLC) with good working knowledge of testing, Agile methodology, disciplines, tasks, resources and scheduling.
  • Developed logical data models and physical database design and generated database schemas using SAP PowerDesigner.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Worked on master data (entities and attributes) and captured how data is interpreted by users in various parts of the organization.
  • Conducted JAD sessions with management, vendors, users and other stakeholders for open and pending issues to develop specifications.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture.
  • Researched and developed hosting solutions using MS Azure for service solution.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Implemented Forward Engineering by using DDL scripts and generating indexing strategies.
  • Reverse Engineered physical data models from SQL Scripts and databases.
  • Worked with Data Analytics, Data Reporting, Ad-hoc Reporting, Graphs, Scales, PivotTables and OLAP reporting.
  • Led the Data Governance process with external and internal data providers to ensure timely, accurate and complete data.
  • Documented data dictionaries and business requirements for key workflows and process points.
  • Involved in Dimensional modeling (Star Schema) of the Data Warehouse and used PowerDesigner to design the business processes, dimensions and measured facts.
  • Designed ER diagrams and mapped the data into database objects.
  • Wrote test plans and test cases in compliance with organizational standards.
  • Designed the data warehouse architecture for all the source systems using MS Visio.
  • Responsible for different Data mapping activities from Source systems to Teradata.
  • Developed and maintained an Enterprise Data Model (EDM) to serve as both the strategic and tactical planning vehicles to manage the enterprise data warehouse. This effort involves working closely with the business.
  • Worked with project management, business teams and departments to assess and refine requirements to design/develop BI solutions using MS Azure.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Worked with the data team to profile source data and determine source and metadata characteristics.
  • Involved in data lineage and Informatica ETL source to target mapping development, complying with data quality and governance standards.
  • Designed, developed data integration programs in a Hadoop environment with NoSQL data store Cassandra for data access and analysis.
  • Used Pig to write complex data transformations for extracting, cleansing and processing large data sets and storing data in HDFS.
  • Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
  • Assisted in defining business requirements and created BRD (Business Requirements Document) and functional specifications documents.
  • Involved in Data profiling, Data analysis and data mapping artifacts design.
  • Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
  • Worked on PL/SQL collections including index-by tables, arrays, BULK COLLECT and FORALL (a minimal sketch follows this list).
  • Generated various reports using SQL Server Report Services (SSRS) for business analysts and the management team.
  • Analyzed results from data validation queries to present to user groups.
  • Supported development team & QA team during process design and during performance tuning, Test Strategy and test case development.
  • Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
  • Testing responsibilities included unit testing, integration testing, and business acceptance testing.
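
The PL/SQL bulk-processing features named above can be illustrated with a minimal sketch; the tables (src_orders, tgt_orders, assumed to share the same structure) and the load_date filter are hypothetical.

```sql
-- Minimal sketch: load one day's rows with BULK COLLECT into an index-by
-- collection, then insert them with a single bulk-bound FORALL statement.
DECLARE
  TYPE t_order_tab IS TABLE OF src_orders%ROWTYPE INDEX BY PLS_INTEGER;
  l_orders t_order_tab;
BEGIN
  SELECT *
  BULK COLLECT INTO l_orders
  FROM src_orders
  WHERE load_date = TRUNC(SYSDATE);          -- hypothetical batch filter

  FORALL i IN 1 .. l_orders.COUNT
    INSERT INTO tgt_orders VALUES l_orders(i);

  COMMIT;
END;
/
```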

Environment: SAP PowerDesigner 16.6, Agile, MDM, PL/SQL, SSAS, SSRS, ETL, OLTP, SQL Scripts, Big Data, NoSQL, MS Visio, MS Azure.

Confidential - Eden Prairie, MN

Sr. Data Analyst/ Data Quality Analyst

Responsibilities:

  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Reviewed the transformation specifications written by various work streams to propose additional DQ rules.
  • Collaborated with a small team of skilled professionals to create a center of excellence for data governance and management.
  • Extensively used Agile methodology as the organization standard to implement the data models.
  • Communicated with all business functions to maintain a comprehensive understanding of data quality requirements.
  • Validated existing Data Quality rules to ensure they meet Data Governance requirements (a sample rule check is sketched after this list).
  • Determined data quality requirements by studying business functions, gathering information, and evaluating output requirements and formats.
  • Maintained referential integrity by introducing foreign keys and normalized the existing data structure; worked with the ETL team and provided source-to-target mappings to implement incremental, full and initial loads into the target data mart.
  • Worked on normalization techniques and normalized the data into Third Normal Form (3NF).
  • Arranged various guiding sessions for programmers, engineers, system analysts and others to clarify performance requirements, interfaces, project capabilities and limitations.
  • Developed solutions for data quality issues and collaborated with the business and IT to implement those solutions.
  • Wrote SQL queries (aggregates, conditional statements, sub-queries) and analyzed large data sets, resulting in problem resolutions and improvement recommendations.
  • Analyzed and presented the gathered information in graphical format for the ease of business managers.
  • Produced Source to target data mapping by developing the mapping spreadsheets.
  • Established data quality KPIs and metrics and created the reporting necessary to measure and communicate status toward achievement of data quality targets.
  • Executed the strategy for Enterprise Data functional, SOA, API and data quality testing.
  • Implemented Hadoop-based and Selenium-based test frameworks.
  • Provided IT and business partners consultation on using the Hadoop-based testing platform effectively.
  • Validated business data objects to ensure the accuracy and completeness of the database.
  • Supported SalesForce.com maintenance with services such as periodic data cleansing and workflow.
  • Implemented ETL techniques for Data Conversion, Data Extraction and Data Mapping for different processes as well as applications.
  • Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class words Standards Document.
  • Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Reviewed code and designs to ensure the systems contained no errors and recommended required updates where needed.
  • Identified and tracked the slowly changing dimensions (SCD I, II, III & Hybrid/6) and determined the hierarchies in dimensions.
  • Generated various reports using SQL Server Report Services (SSRS) for business analysts and the management team.
  • Designed OLTP system environment and maintained documentation of Metadata.
  • Used Informatica and SAS to extract, transform and load source data from transactional systems.
  • Well experienced in documenting data relationships, business rules, allowed values, glossary terms and codes.
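
For illustration, a minimal sketch of the kind of data quality rule checks validated above; the customer table and its customer_id/email columns are hypothetical.

```sql
-- Rule 1: the natural key must be unique.
SELECT customer_id, COUNT(*) AS dup_count
FROM   customer
GROUP  BY customer_id
HAVING COUNT(*) > 1;

-- Rule 2: completeness of a mandatory attribute, reported as a percentage.
SELECT ROUND(100 * SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END) / COUNT(*), 2)
         AS pct_missing_email
FROM   customer;
```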

Environment: IDQ 9.5, OLTP, DBAs, ETL, DDL, DML, SQL, Data Mapping, Metadata, SAS, Informatica 9.5

Confidential - Atlanta, GA

Sr. Data Modeler/ Data Analyst

Responsibilities:

  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Part of the team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
  • Worked on NoSQL databases including Cassandra. Implemented multi-data center and multi-rack Cassandra cluster.
  • Coordinated with Data Architects on AWS provisioning EC2 Infrastructure and deploying applications in Elastic load balancing.
  • Performed Reverse Engineering of the current application using Erwin, and developed Logical and Physical data models for Central Model consolidation.
  • Translated logical data models into physical database models and generated DDLs for DBAs.
  • Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Collected, analyzed and interpreted complex data for reporting and/or performance trend analysis.
  • Wrote and executed unit, system, integration and UAT scripts in data warehouse projects.
  • Extensively used ETL methodology to support data extraction, transformation and load processing in a complex DW using Informatica.
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
  • Involved in writing T-SQL and worked on SSIS, SSRS, SSAS, Data Cleansing, Data Scrubbing and Data Migration.
  • Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of Data Analysis responsibilities.
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
  • Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XIR2 (a sample reconciliation query is sketched after this list).
  • Worked on importing and cleansing of high-volume data from various sources such as Teradata, Oracle and flat files.
  • Wrote SQL scripts to test the mappings and developed a business requirements Traceability Matrix.
  • Created SQL tables with referential integrity, constraints and developed queries using SQL, SQL*PLUS and PL/SQL.
  • Performed GAP analysis of current state to desired state and document requirements to control the gaps identified.
  • Developed the batch program in PL/SQL for the OLTP processing and used UNIX shell scripts to run it in crontab.
  • Identified and recorded defects with the information required for the development team to reproduce the issues.
  • Worked on the reporting requirements and was involved in generating the reports for the Data Model using Crystal Reports.
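
A sketch of the report-reconciliation queries referenced above: compare totals in the base fact table against the aggregate feeding the Business Objects report. The sales_fact and rpt_sales_summary tables, their columns, and the :report_month bind variable are hypothetical.

```sql
-- Regions where the warehouse total does not tie out to the reported total.
SELECT f.region_cd,
       SUM(f.sale_amt)          AS fact_total,
       MAX(r.reported_sale_amt) AS report_total
FROM   sales_fact f
JOIN   rpt_sales_summary r
       ON  r.region_cd    = f.region_cd
       AND r.report_month = :report_month
WHERE  f.sale_month = :report_month
GROUP  BY f.region_cd
HAVING SUM(f.sale_amt) <> MAX(r.reported_sale_amt);
```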

Environment: Erwin 9.0, PL/SQL, Business Objects XIR2, Informatica 8.6, Oracle 11g, Teradata R13, Teradata SQL Assistant 12.0, Flat Files

Confidential - Union, NJ

Sr. Data Modeler/ Data Analyst

Responsibilities:

  • Worked with Business Analysts team in requirements gathering and in preparing functional specifications and translating them to technical specifications.
  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Planned and defined system requirements to Use Case, Use Case Scenario and Use Case Narrative using the UML (Unified Modeling Language) methodologies.
  • Gathered the analysis report prototypes from the business analysts belonging to different business units; participated in JAD sessions involving the discussion of various reporting needs.
  • Reverse engineered the existing data marts and identified the Data Elements (in the source systems), Dimensions, Facts and Measures required for reports.
  • Conducted design discussions and meetings to arrive at the appropriate Data Warehouse grain, at the lowest level, for each of the Dimensions involved.
  • Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
  • Designed a STAR schema for sales data involving shared (conformed) dimensions for other subject areas using Erwin Data Modeler (a minimal DDL sketch follows this list).
  • Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Validated and updated the appropriate LDMs to process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
  • Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
  • Ensured the feasibility of the logical and physical design models.
  • Worked on snowflaking the dimensions to remove redundancy.
  • Wrote PL/SQL statements, stored procedures and triggers in DB2 for extracting as well as writing data.
  • Defined facts, dimensions and designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
  • Involved in Data profiling and performed Data Analysis based on the requirements, which helped in catching many Sourcing Issues upfront.
  • Developed Data mapping, Data Governance, Transformation and Cleansing rules for the Data Management involving OLTP, ODS and OLAP.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Normalized the database based on the new model developed to put them into the 3NF of the data warehouse.
  • Used SQL tools like Teradata SQL Assistant and TOAD to run SQL queries and validate the data in warehouse.
  • Created SSIS package for daily email subscriptions to alert Tableau subscription failure using the ODBC driver and PostgreSQL database.
  • Designed logical and physical data models, Reverse engineering, Complete compare for Oracle and SQL server objects using Erwin.
  • Constructed complex SQL queries with sub-queries and inline views as per the functional needs in the Business Requirements Document (BRD).
  • Worked with supporting business analysis and marketing campaign analytics with data mining, data processing, and investigation to answer complex business questions.
  • Involved in designing and developing SQL server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL.
  • Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
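
A minimal DDL sketch of the kind of star schema described above, written in Transact-SQL to match the environment; all table and column names are hypothetical and the grain (one row per product per day) is illustrative.

```sql
CREATE TABLE dim_date (
    date_key      INT          NOT NULL PRIMARY KEY,  -- surrogate key, e.g. 20120615
    calendar_date DATE         NOT NULL,
    fiscal_month  VARCHAR(10)  NOT NULL
);

CREATE TABLE dim_product (
    product_key   INT          NOT NULL PRIMARY KEY,  -- surrogate key
    product_code  VARCHAR(20)  NOT NULL,              -- natural key from the source
    product_name  VARCHAR(100) NOT NULL
);

-- Fact table keyed by the conformed dimensions above.
CREATE TABLE fact_sales (
    date_key      INT           NOT NULL REFERENCES dim_date (date_key),
    product_key   INT           NOT NULL REFERENCES dim_product (product_key),
    sales_qty     INT           NOT NULL,
    sales_amount  DECIMAL(18,2) NOT NULL,
    CONSTRAINT pk_fact_sales PRIMARY KEY (date_key, product_key)
);
```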

Environment: PL/SQL, Erwin 8.5, MS SQL 2008, OLTP, ODS, OLAP, SSIS, Tableau, ODBC, Transact-SQL, TOAD, Teradata SQL Assistant

Confidential - Santa Ana, CA

Data Analyst/Data Modeler

Responsibilities:

  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Set up environments to be used for testing and defined the range of functionalities to be tested as per technical specifications.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Wrote and executed unit, system, integration and UAT scripts in a data warehouse projects.
  • Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
  • Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
  • Applied data warehouse concepts and dimensional data modeling using the Ralph Kimball methodology.
  • Responsible for different data mapping activities from source systems to Teradata.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Used CA Erwin Data Modeler (Erwin) for data modeling (data requirements analysis, database design, etc.) of custom-developed information systems, including databases of transactional systems and data marts.
  • Responsible for analyzing data from various heterogeneous sources such as flat files, ASCII data, EBCDIC data and relational data (Oracle, DB2 UDB, MS SQL Server).
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
  • Executed campaigns based on customer requirements.
  • Followed company code standardization rules.
  • Performed ad hoc analyses as needed.
  • Involved in Teradata SQL development, unit testing and performance tuning, and ensured testing issues were resolved using defect reports.
  • Tested the ETL process both before and after the data validation process; tested the messages published by the ETL tool and the data loaded into various databases.
  • Created UNIX scripts for file transfer and file manipulation.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
  • Ensured the onsite-to-offshore transition, QA processes and closure of problems and issues.
  • Tested the database to check field size validation, check constraints and stored procedures, and cross-verified the field sizes defined within the application against the metadata (a sample metadata check is sketched after this list).
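
A sketch of the metadata cross-check mentioned in the last bullet, assuming an Oracle source; the mapping_spec table (holding documented field sizes) and the STAGE schema are hypothetical.

```sql
-- Flag columns whose physical length differs from the documented specification.
SELECT c.table_name,
       c.column_name,
       c.data_type,
       c.data_length       AS actual_length,
       m.expected_length
FROM   all_tab_columns c
JOIN   mapping_spec    m
       ON  m.table_name  = c.table_name
       AND m.column_name = c.column_name
WHERE  c.owner = 'STAGE'
  AND  c.data_length <> m.expected_length;
```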

Environment: Informatica 8.1, Data Flux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files, Teradata
