
Sr. Data Analyst/Data Modeler Resume

New York, NY

SUMMARY:

  • Over 8 years of experience in Information Technology in Data Modeling, Data Analysis, Design and development of Databases for business applications in Data Warehousing Environments
  • Experience in various domains like Insurance, Health Care, Logistics, Financial and Banking.
  • Expert knowledge of the SDLC (Software Development Life Cycle); involved in all phases of the project process and familiar with Agile, Scrum, and Waterfall methodologies.
  • Expert in Business Requirement Document (BRD), Technical Specification Document (TSD), Customer Relationship Management (CRM), Business Rules Management (BRM), Business Rules Engine (BRE).
  • Worked extensively on forward engineering, reverse engineering and naming standards processes. Created DDL scripts for implementing data modeling changes.
  • Experience in relational, dimensional, and multidimensional data modeling, including the creation of conceptual data models.
  • Experience in designing of star schema, snowflake schema for online analytical processing (OLAP) and online transactional processing (OLTP) systems and developing Conceptual, Logical and Physical data models.
  • Worked on projects related to Data Modeling, System/Data Analysis, Design and Development for both OLTP and Data warehousing environments
  • Strong hands-on experience in Data Modeling tools like Erwin, Power Designer and ER Studio for developing Entity-Relationship diagrams
  • Efficient in Data Warehousing/Data Mart design. Maintained data integrity and consistency by passing data through several analysis steps such as parsing and prototyping.
  • Experience in Ralph Kimball and Bill Inmon approaches.
  • Well versed with Normalization (1NF, 2NF, 3NF) / De-normalization techniques in relational/dimensional database environments for optimum performance.
  • Significant exposure to the Talend ETL tool. Well versed in analyzing data flows for validations and data quality issues; optimized the procedures and functions used by ETL packages to reduce ETL process time.
  • Experience in performance tuning and query optimization techniques in transactional and data warehouse environments.
  • Good knowledge of writing SQL queries using stored procedures, views, indexes, and joins
  • Experience in data transformation, data profiling, and data mapping from source to target database schemas, as well as data cleansing and tracking of data quality.
  • A very good command of SQL queries, SQL*Plus, PL/SQL packages, procedures, and functions; performance analysis using indexes and indexed views; and creating partitions and aggregate tables when required
  • Experience in creating tables, constraints, views, and materialized views using ERwin, ER Studio, Power Designer and SQL Modeler.
  • Strong communication and interpersonal skills with a proven track record of adaptability, creativity, and innovation, along with strong technical and managerial skills demonstrated while successfully leading teams to strict project deadlines.
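
The star/snowflake schema design noted above can be illustrated with a minimal sketch (hypothetical sales subject area using SQLite; table and column names are illustrative, not from any actual engagement):

```python
import sqlite3

# Minimal star schema sketch: one fact table surrounded by dimensions,
# each joined on a surrogate key. All names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date TEXT NOT NULL
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL
);
CREATE TABLE fact_sales (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key  INTEGER NOT NULL REFERENCES dim_product(product_key),
    sales_amount REAL NOT NULL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15')")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
conn.execute("INSERT INTO fact_sales VALUES (20240115, 1, 99.50)")

# Typical star-join rollup: facts aggregated by a dimension attribute.
row = conn.execute("""
    SELECT p.product_name, SUM(f.sales_amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.product_name
""").fetchone()
print(row)  # ('Widget', 99.5)
```

A snowflake variant would further normalize the dimensions (e.g. splitting product category into its own table), trading some query simplicity for reduced redundancy.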

TECHNICAL SKILLS:

Data Modeling Tools: ERwin r7.1, 7.2, r8.2, r9.1, r9.5, and r9.64, Embarcadero ER Studio, Enterprise Architect, Oracle Data Modeler, Sybase Power Designer.

Database Tools: DB2, Microsoft SQL Server 2005, 2008, 2012, and 2014, MS Access 2000, MySQL, Oracle 8i, 9i, 10g, 11g, and 12c, PostgreSQL, Sybase ASE, and Teradata V2R6.

ETL Tools: Ab Initio, Data Junction, DataStage, Informatica 6.2 and 7.1, SSIS, Talend Open Studio, Autosys.

OLAP Tools: Microsoft Analysis Services, Business Objects and Crystal Reports 9.

Other Tools: SAP ERP EHP, SAP GUI.

Packages: Microsoft Office Suite, Microsoft Project 2010, and Microsoft Visio.

Programming Languages: SQL, PL/SQL, HTML, XML.

PROFESSIONAL EXPERIENCE:

Confidential, New York, NY

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Participated in requirement gathering with business users to understand and document the business requirements in alignment to the financial goals of the company.
  • Designed logical and physical data models using data provisioning and consumption techniques.
  • Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
  • Reverse engineered existing data models and updated them.
  • Collaborated with application developers and DBAs to discuss de-normalization, partitioning, and indexing schemes for the physical model.
  • Created naming standards for data attributes and metadata to track data source, load frequency, generated key values, and data dictionary.
  • Created ETL jobs and custom transfer components to move data from smaller teams to centralized area (SQL Server) to meet minimum capital requirements using SSIS.
  • Conducted data modeling sessions with business users and technical staff, conveying complex ideas both verbally and in writing.
  • Created Data Dictionaries, Source to Target Mapping Documents and documented Transformation rules for all the fields.
  • Closely interacted with the Business Users and was involved in gathering, assimilation, organization, categorization and analysis of business requirements
  • Created Mapping documents using the Metadata extracted from the Metadata repositories
  • Isolated gaps between technical and business requirements. Reviewed pre-approved business requirements with the team and identified specific system requirements (technology specs to meet the business needs).
  • Performed data analysis on the source data to understand the relationships between entities
  • Continuously supported the production on-call support and resolved the tickets with short term and long term fixes.
  • Worked with testing team to validate and review the test scripts, test cases, test results.
  • Produced BVRs (Business Verification Reports) as part of production support.
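
The DDL automation described above can be sketched as a small script that renders CREATE TABLE statements from a data-dictionary style definition (the table and column definitions here are hypothetical):

```python
# Sketch of DDL generation from a data-dictionary mapping, in the spirit
# of the automated DDL/DML scripts described above. All names hypothetical.
data_dictionary = {
    "customer": [
        ("customer_id", "INTEGER",      "PRIMARY KEY"),
        ("full_name",   "VARCHAR(100)", "NOT NULL"),
        ("created_at",  "DATE",         ""),
    ]
}

def generate_ddl(table, columns):
    """Render a CREATE TABLE statement from (name, type, constraint) tuples."""
    cols = ",\n    ".join(
        " ".join(part for part in col if part) for col in columns
    )
    return f"CREATE TABLE {table} (\n    {cols}\n);"

ddl = generate_ddl("customer", data_dictionary["customer"])
print(ddl)
```

Driving DDL from a single dictionary keeps naming standards and the data dictionary itself in sync with what is actually deployed.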

Environment: ERwin 9.2, Oracle 11g, SQL Server 2012/2014, Microsoft Visual Studio 2008/2012, SQL Developer, SSIS, SSRS, SSAS, TOAD Data Analyst 3.1, and MS Office Suite.

Confidential, Irvine, CA

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Gathered business requirements, interacted with business analysts to finalize requirements.
  • Translated existing databases into models to understand objects and relations using reverse engineering function.
  • Performed walkthroughs for conceptual, logical and physical data model for validating designs for the enterprise data warehouse.
  • Designed LDM (Logical Data Model) and PDM (Physical Data Model) using IBM Infosphere Data Architect data modeling tool and Oracle PL/SQL
  • Involved with implementing Kimball methodology to design data marts.
  • Implemented primary key, foreign key constraints using RI (referential integrity).
  • Used normalization and de-normalization techniques to achieve optimum performance of the database.
  • Responsible for generating database scripts using forward engineering.
  • Identified SCD (Slowly Changing Dimensions) and categorized them under different types like Type-1 and Type-2.
  • Performed data analysis on the source data using Teradata SQL Assistant.
  • Designed documents to assist developers, including the data flow model in Visio, data mapping documents, and the design document.
  • Worked with the ETL team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
  • Used Talend ETL tool to move data from Data Warehouse to Data Marts. Also, used to load data from files to staging tables.
  • Assisted with user testing of systems, developed and maintained quality procedures, and ensured that appropriate documentation was in place.
  • Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and bulk collects.
  • Wrote SQL triggers and stored procedures.
  • Created domains and metadata reports using ERwin data modeler.
  • Resolved the data type inconsistencies between the source systems and the target system using the mapping documents.
  • Conducted design walkthroughs, which helped the developers to understand the design.
  • Generated the DDL and assisted in designing the overall ETL strategy.
  • Maintained security and data integrity of the database.
  • Carried out effective data profiling to eradicate anomalies between source and target data.
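
The Type-1/Type-2 slowly changing dimension handling identified above can be sketched roughly as follows (a hypothetical customer dimension; real ETL tooling would generate the equivalent merge logic):

```python
from datetime import date

# Simplified Type-2 SCD: when a tracked attribute changes, expire the
# current row and insert a new version, preserving history.
# The dimension layout and values here are hypothetical.
dim_customer = [
    {"customer_id": 42, "city": "Irvine", "effective_from": date(2020, 1, 1),
     "effective_to": None, "is_current": True},
]

def apply_scd2(dim, customer_id, new_city, change_date):
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return  # no change: nothing to version
            row["effective_to"] = change_date   # close out the old version
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "effective_from": change_date, "effective_to": None,
                "is_current": True})

apply_scd2(dim_customer, 42, "San Diego", date(2023, 6, 1))
print(len(dim_customer))  # 2 rows: history preserved
```

A Type-1 change, by contrast, would simply overwrite `city` in place with no new row.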

Environment: ERwin 8.x, IBM infosphere data stage, Talend Open Studio, Teradata SQL Assistant, SQL Server 2012.

Confidential, San Francisco, CA

ETL Data Analyst

Responsibilities:

  • Translated business requirements and models into feasible and acceptable data warehouse designs.
  • Designed and built appropriate data repositories and data movements (dimensional databases) to ensure that business needs were met.
  • Created database models, which served as blueprints for project engagements of all complexities.
  • Modeled data for both OLTP and data warehousing environments, including conceptual, logical and physical models.
  • Gathered reference data, business rules, transformation rules, and statistical models supporting campaign identification and suppression of target populations.
  • Documented and extracted the ETL information for the Integration/Operational Data Store after validating the XML payload.
  • Translated enterprise and business requirements into long-term data architecture solutions, including conceptual, logical and physical models.
  • Designed and reviewed logical and physical data models for OLTP and OLAP in compliance with corporate standards.
  • Used SSIS to create ETL packages to validate, extract, transform and load data into the data warehouse and data mart.
  • Responsible for the design, build, and maintenance of data architectures for complex, highly visible business intelligence and analytic solutions that are both flexible and scalable.
  • Designed and oversaw the implementation of data transformation and change processing procedures to ensure that data from multiple sources was aggregated, normalized, updated, and then published to multiple target environments.
  • Involved in the creation of the naming standards file and data mapping documents. Involved in data conversions and gap analysis from legacy systems to updated technologies using custom SQL scripting as well as ETL applications for relational solutions.
  • Involved in the migration of project-related tables from one environment to another.
  • Created ETL jobs and transfer components to move data from smaller teams to a centralized area (SQL Server) to meet minimum capital requirements using SSIS.
  • Implemented architectural designs while building solid relationships with stakeholders at all levels.
  • Worked with functional analysts, developers and development managers to ensure that all solutions were deployed within agreed timelines and supported after delivery.
  • Developed reusable assets such as solution architectures, both physical and logical.
  • Created the solution architecture and planned the development effort.
  • Led and mentored the project team from both technical and functional perspectives, including database development, data architecture, integration development, requirements identification, testing, and project management.
  • Analyzed, created and maintained Primary Indexes (PI), Secondary Indexes (SI), Join Indexes (JI), Partitioned Primary Indexes (PPI), Multilevel Partitioned Primary Indexes (MPPI), and compression.
  • Worked extensively on all phases of the project development life cycle: system analysis, design, development, testing and implementation.
  • Created architecture solution designs and documentation to drive high-quality, feasible implementations.
  • Created and managed the Work Back Schedule (WBS) of business intelligence teams.
  • Provided solutions that effectively met the performance, extensibility, integration, capacity, and accessibility needs of applications.
  • Responsible for transforming requirements into architecture and design documents that were used by the team to implement solutions.
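
The aggregate/normalize/publish flow over multiple sources described above can be sketched as a small conforming step (source layouts and field names here are hypothetical):

```python
# Rough sketch of conforming records from two differently shaped sources
# into one normalized structure before publishing. All names hypothetical.
source_a = [{"cust": "  alice ", "amt": "100.50"}]
source_b = [{"customer_name": "BOB", "amount": 25.0}]

def normalize_a(rec):
    # Source A: trim padded names, parse amounts stored as strings.
    return {"customer": rec["cust"].strip().title(),
            "amount": float(rec["amt"])}

def normalize_b(rec):
    # Source B: already numeric, but names need case standardization.
    return {"customer": rec["customer_name"].strip().title(),
            "amount": float(rec["amount"])}

# Aggregate both sources into one conformed structure, then roll up.
conformed = [normalize_a(r) for r in source_a] + [normalize_b(r) for r in source_b]
total = sum(r["amount"] for r in conformed)
print(conformed, total)  # total is 125.5
```

One per-source normalizer feeding a shared target shape is what keeps the published data consistent regardless of how each upstream system formats it.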

Environment: Erwin, ER Studio, MS Office Suite Applications, MS Visio, Business Objects XIR2, Oracle, Toad, Teradata, Data Warehouse, SDLC, Informatica, RTAM, Rule Point

Confidential, Roseland, NJ

Data Modeler/Data Analyst

Responsibilities:

  • Facilitated JAD sessions for project scoping, requirements gathering, and identification of business subject areas.
  • Identified and documented detailed business rules and use cases based on requirements analysis.
  • Identified the entities and relationship between the entities to develop Conceptual Model using ERWIN.
  • Developed Logical Model from the conceptual model.
  • Worked with DBAs to create a best-fit Physical Data Model from the logical data model.
  • Used forward engineering to generate DDL from the Physical Data Model and handed it to the DBA.
  • Analyzed and profiled data from the Folderwave source (similar to Banner), an ERP system providing student data. Also worked with the TES college source to get students' degree audit, degree planning, and transfer articulation data.
  • Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
  • Designed STAR schemas for the detailed data marts and plan data marts consisting of conformed dimensions.
  • Worked on slowly changing dimensions.
  • Worked with ETL teams to create source and target definition.
  • Responsible for defining the naming standards for data warehouse.
  • Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Created and maintained Database Objects (Tables, Views, Indexes and Partitions).
  • Involved in updating metadata repository while detailing on use of applications and data transformation to facilitate impact analysis.
  • Prototyped data visualizations using Charts, drill-down, parameterized controls using Tableau to highlight the value of analytics in Executive decision support control.
  • Developed Ad-hoc reports using Tableau Desktop, Excel.
  • Worked with an SVN repository (similar to Git) to keep track of code changes and version history.
  • Involved in Performance Tuning of the database, which included creating indexes, optimizing SQL statements and monitoring the server.
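
The index-based performance tuning noted above boils down to creating an index and confirming in the query plan that the optimizer actually uses it. A minimal sketch (hypothetical table, demonstrated with SQLite, whose plan wording varies by version):

```python
import sqlite3

# Sketch of index tuning: compare the query plan before and after
# creating an index on the filtered column. Schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, "OPEN" if i % 2 else "CLOSED") for i in range(1000)])

# Before: the equality filter on status typically forces a full table scan.
before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE status = 'OPEN'").fetchall()

conn.execute("CREATE INDEX idx_orders_status ON orders(status)")

# After: the plan should report a SEARCH using idx_orders_status.
after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE status = 'OPEN'").fetchall()
print(before)
print(after)
```

The same verify-the-plan discipline applies in Oracle (`EXPLAIN PLAN`) or SQL Server (estimated execution plans); the tool differs, the workflow does not.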

Environment: Oracle 11g, SQL Server 2005/2008, DB2, Talend 5.6, Erwin r9.1, Crystal Reports, SSIS/SSRS, MS-SQL server manager, Microsoft Access, Tableau, SVN, Informatica Power Center, Informatica Data Quality.

Confidential, Alpharetta, GA

Data Modeler

Responsibilities:

  • Gathered requirements from users through JAD sessions; a series of meetings was conducted with the business system users to capture the reporting requirements.
  • Understood basic business analysis concepts for logical data modeling, data flow processing, and database design.
  • Created and designed conceptual data models using ERwin data modeler.
  • Created logical and physical data models for dimensional data modeling using best practices to ensure high data quality and reduced redundancy with the IDW standards and guidelines.
  • Designed and developed Oracle PL/SQL procedures and UNIX shell scripts for data import/export and data conversions.
  • Performed legacy application data cleansing, data anomaly resolution and developed cleansing rule sets for ongoing cleansing and data synchronization.
  • Extensively used star schema methodologies in building and designing the logical data model into dimensional models.
  • Involved in project cycle plan for the data warehouse, source data analysis, data extraction process, transformation and loading strategy designing.
  • Worked on database design for OLTP and OLAP systems.
  • Designed a STAR schema for sales data involving shared dimensions (conformed) using ERwin Data Modeler.
  • Designed and built the OLAP cubes for star schema and snowflake schema using the native OLAP service manager.
  • Extensively used Teradata utilities (BTEQ, FastLoad, MultiLoad, TPump) to import/export and load data from Oracle and flat files.
  • Performed data analysis tasks on warehouses fed from several sources such as Oracle, DB2, and XML, and generated various reports and documents.
  • Created database maintenance plans for the performance of SQL server which covers database integrity checks, update database statistics and re-indexing.
  • Involved in workflows and monitored jobs using Informatica tools.
  • Used SSIS to create ETL packages to validate, extract, transform and load data into data warehouse and data mart.
  • Developed stored procedures on Netezza and SQL server for data manipulation and data warehouse population.
  • Actively involved in normalization (3NF) & de-normalization of database.
  • Involved in the implementation of SQL Server 2008.
  • Developed multiple processes for Daily Data Ingestion from Client associated data vendors and Production Team, Client site employees using SSIS and SSRS.
  • Created multiple custom SQL queries in Teradata SQL workbench to prepare the right data sets for Tableau dashboards.
  • Worked with the team in mortgage domain to implement designs based on the free cash flow, acquisition, and capital efficiency.
  • Resolved the data type inconsistencies between the source systems and the target system using the mapping documents and analyzing the database using SQL queries.
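
Resolving source/target data-type inconsistencies from a mapping document, as described above, can be sketched as a per-column cast table (the mapping and row values here are hypothetical):

```python
from datetime import datetime

# Sketch of a source-to-target type mapping: each target column gets a
# cast derived from the mapping document. All names are hypothetical.
type_mapping = {
    "order_id":   int,                                                # CHAR    -> INTEGER
    "order_date": lambda v: datetime.strptime(v, "%m/%d/%Y").date(),  # VARCHAR -> DATE
    "amount":     lambda v: round(float(v), 2),                       # VARCHAR -> DECIMAL(9,2)
}

source_row = {"order_id": "0042", "order_date": "01/15/2024", "amount": "99.999"}
target_row = {col: cast(source_row[col]) for col, cast in type_mapping.items()}
print(target_row)
```

Centralizing the casts in one mapping structure means a change in the mapping document is a one-line code change, and rows that fail a cast surface immediately rather than loading as bad data.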

Environment: ERwin 9.1, Data Modeling, Informatica Power Center9.6, Teradata SQL, PL/SQL, BTEQ, DB2, Oracle, Agile, ETL, Tableau, Cognos, Business Objects, UNIX, SQL Server 2008, TOAD, SAS, SSRS, SSIS, T-SQL etc.

Confidential

Data Modeler

Responsibilities:

  • Designed logical and physical data models using Erwin.
  • Developed conceptual and logical data models and transformed them into schemas using ERWIN.
  • Developed the data mart for the base data in star and snowflake schemas as part of building the data warehouse for the database.
  • Forward engineered data models, reverse engineered existing data models, and updated them.
  • Cloned databases to synchronize the test and development databases with the production database.
  • Wrote procedures and packages using dynamic PL/SQL.
  • Created database tables, views, indexes, triggers, and sequences, and developed the database structure.
  • Performed data cleansing on staging tables.
  • Wrote complex SQL, PL/SQL procedures, functions, and packages to validate data and support the testing process.
  • Normalized the data and developed the star schema.
  • Reduced query CPU cost using optimizer hints and SQL query tuning.
  • Performed extensive manual testing.
  • Extensively generated both logical and physical reports from Erwin.
  • Involved in developing the data warehouse for the complete Actuarial Information System application.
  • Worked with DBAs to create tables in the database after generating DDL.
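
The staging-table cleansing step mentioned above typically means trimming whitespace, standardizing case, and de-duplicating before the load. A minimal sketch (hypothetical staging layout, shown in SQL via SQLite):

```python
import sqlite3

# Sketch of staging-table cleansing: trim, standardize case, then drop
# duplicates, keeping the first occurrence. Layout is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_policy (policy_no TEXT, holder TEXT)")
conn.executemany("INSERT INTO stg_policy VALUES (?, ?)", [
    (" P-100 ", "alice"),
    ("P-100", "ALICE"),     # becomes a duplicate after cleansing
    ("P-200", " bob "),
])
conn.executescript("""
UPDATE stg_policy SET policy_no = TRIM(policy_no),
                      holder = UPPER(TRIM(holder));
DELETE FROM stg_policy WHERE rowid NOT IN (
    SELECT MIN(rowid) FROM stg_policy GROUP BY policy_no, holder
);
""")
rows = conn.execute("SELECT * FROM stg_policy ORDER BY policy_no").fetchall()
print(rows)  # [('P-100', 'ALICE'), ('P-200', 'BOB')]
```

Running the cleansing in staging, before any target load, keeps bad or duplicate rows out of the warehouse entirely.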

Environment: Erwin Data Modeler 7, Oracle 8i, SQL Server 2008, ER/Studio, Informatica, MS Office (Word, Excel, PowerPoint, Project), TOAD

Confidential

Business Analyst

Responsibilities:

  • Conducted JAD sessions periodically with various stakeholders at various phases of the Software Development Life Cycle (SDLC) to discuss and resolve open issues.
  • Interacted with Subject Matter Experts (SMEs) and stakeholders to gain a better understanding of client business processes and gather business requirements.
  • Worked with Internal and External Users to define requirements and coordinated user management and developer expectations.
  • Mapped the Source information to identify the Target information.
  • Developed Use Cases using MS Visio, and a detailed project plan with emphasis on deliverables.
  • Monitored Change Requests and documented requirements, integrating them with Use Cases.
  • Developed and Implemented Test Strategies using the Test Director.
  • Interacted with stakeholders to understand client business processes and requirements.
  • Gathered business, functional and technical requirements.
  • Studied and analyzed the data inventory list; identified, categorized and classified items into business areas.
  • Performed attribute-level gap analysis of the inventory against the data in the Kalido EDW.
  • Summarized gaps and categorized them by business unit, source system and process type.
  • Analyzed complex data sets, performing ad-hoc analysis and data manipulation in the Landing, Staging and Warehouse schemas using SQL.
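
At its core, the attribute-level gap analysis described above is a set difference between the attributes the inventory requires and those the EDW carries. A minimal sketch (attribute names are hypothetical):

```python
# Sketch of attribute-level gap analysis: which required attributes are
# missing from the warehouse? All attribute names here are hypothetical.
inventory_attrs = {"customer_id", "customer_name", "region", "credit_limit"}
edw_attrs = {"customer_id", "customer_name", "region"}

gaps = sorted(inventory_attrs - edw_attrs)   # required but absent in the EDW
print(gaps)  # ['credit_limit']
```

The resulting gap list is what then gets categorized by business unit, source system, and process type for remediation.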

Environment: Oracle, TOAD, Microsoft Visual Source Safe, IBM Rational Clear Case and Clear Quest, Microsoft Office.
