Data Modeler/Data Analyst Resume
Hartford, CT
SUMMARY
- Around 8 years of professional experience in Data Modeling and Data Analysis; proficient in gathering business requirements and handling requirements management.
- Good in system analysis, ER Dimensional Modeling, Database design and implementing RDBMS specific features.
- Good understanding of AWS, big data concepts and Hadoop ecosystem.
- Good experience with normalization (1NF, 2NF, and 3NF) and de-normalization techniques for improved database performance in OLTP, OLAP, Data Warehouse, and Data Mart environments.
- Expert knowledge of the SDLC (Software Development Life Cycle), with involvement in all project phases.
- Generated DDLs and created tables and views in the corresponding layers.
- Excellent proficiency in Agile methodologies.
- Implemented business logic using stored procedures, T-SQL, and triggers (DML and DDL).
- Deep understanding of Big Data analytics and algorithms using Hadoop, MapReduce, NoSQL, and distributed computing tools.
- Solid experience in creating cloud-based solutions and architectures using Amazon Web Services, Amazon RDS, and Microsoft Azure.
- Good experience in working with different reporting tool environments like SQL Server Reporting Services (SSRS), Cognos and Business Objects.
- Solid hands-on experience administering data model repositories and documenting in metadata portals with tools such as Erwin, ER Studio, and Power Designer.
- Hands-on experience with the Kimball and Inmon dimensional modeling methodologies.
- Strong hands-on knowledge of Microsoft Office applications, including MS Word, MS Excel, and MS PowerPoint.
- Knowledge and working experience with AWS tools such as Amazon S3, Amazon Redshift, and AWS data lake services.
- Efficient in all phases of the development lifecycle, including Data Cleansing, Data Conversion, Data Profiling, Data Mapping, Performance Tuning, and System Testing.
- Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages, and functions using SQL and PL/SQL to implement business logic.
- Proficient in normalization/de-normalization techniques in relational and dimensional database environments, having normalized models up to 3NF (a brief DDL sketch appears at the end of this summary).
- Good understanding of the Ralph Kimball (dimensional) and Bill Inmon (relational) modeling methodologies.
- Strong experience using MS Excel and MS Access to load and analyze data based on business needs.
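A minimal, hypothetical DDL sketch of the 3NF normalization work referenced above; all table and column names are illustrative and not tied to any specific engagement.

```sql
-- Splitting a denormalized order feed into 3NF: each non-key attribute depends
-- on the key, the whole key, and nothing but the key. Names are hypothetical.
CREATE TABLE customer (
    customer_id   INT PRIMARY KEY,
    customer_name VARCHAR(100),
    city          VARCHAR(50)
);

CREATE TABLE product (
    product_id   INT PRIMARY KEY,
    product_name VARCHAR(100),
    unit_price   DECIMAL(10, 2)
);

CREATE TABLE customer_order (
    order_id    INT PRIMARY KEY,
    customer_id INT REFERENCES customer (customer_id),
    order_date  DATE
);

CREATE TABLE order_line (
    order_id   INT REFERENCES customer_order (order_id),
    product_id INT REFERENCES product (product_id),
    quantity   INT,
    PRIMARY KEY (order_id, product_id)
);
```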
TECHNICAL SKILLS
- Data Modeling Tools: Erwin Data Modeler 9.7/9.6, Erwin Model Manager, ER Studio v17, and Power Designer.
- Big Data: Hadoop 3.0, HDFS, Hive 3.3, HBase 1.2, Sqoop 1.4, NoSQL, MapReduce.
- Programming Languages: SQL, PL/SQL, HTML5, XML and VBA.
- Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS Excel, SAS BI Platform.
- Cloud Platforms: AWS (EC2, S3, Redshift) and MS Azure
- OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9
- Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.
- Operating System: Windows, Unix, Sun Solaris
- ETL/ Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, and Pentaho.
- Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model
PROFESSIONAL EXPERIENCE
Confidential, Hartford, CT
Data Modeler/Data Analyst
Responsibilities:
- Worked in a Data Modeler/Analyst role to review business requirements and compose source-to-target data mapping documents.
- Handled Big Data using Hadoop ecosystem components such as Sqoop, Pig, and Hive.
- Extensively used Erwin for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
- Extensively used Agile methodology as the organization standard to implement the data models.
- Developed MapReduce programs using Apache Hadoop to analyze big data as per requirements.
- Extensively used Python in data collection programming.
- Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL-like access to Hadoop data (a HiveQL sketch follows this list).
- Researched and developed hosting solutions on MS Azure for the service solution.
- Worked on Normalization and De-normalization concepts and design methodologies such as Ralph Kimball's and Bill Inmon's Data Warehouse methodologies.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
- Designed the data marts in dimensional data modeling using star schemas and snowflake schemas.
- Configured match rule set properties by enabling search-by-rules in MDM according to business rules.
- Developed T-SQL scripts to create databases, database objects and to perform DML and DDL tasks.
- Proficient in data mart design, cube creation, identification of facts and dimensions, star and snowflake schemas, and canonical models.
- Worked on cloud computing using Microsoft Azure with various BI technologies and explored NoSQL options for the current back end using Azure Cosmos DB (SQL API).
- Designed and developed T-SQL stored procedures to extract, aggregate, transform, and insert data, and to query dimension and fact tables in the data warehouse (a T-SQL sketch appears at the end of this project entry).
- Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
- Worked on data loads using Azure Data Factory with an external table approach.
- Developed SSIS packages to load data from various source systems to Data Warehouse.
- Configured business rules and workflows for process management and data governance.
- Developed data marts for the base data in star schema and snowflake schema as part of developing the data warehouse.
- Used SQL Server Reporting Services (SSRS) for database reporting in Oracle.
- Developed stored procedures, SQL joins, and SQL queries for data retrieval and analysis, and exported the data into CSV and Excel files.
- Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
- Implemented a snowflake schema to minimize redundancy in the database.
- Involved in creating Hive tables and in loading and analyzing data using Hive queries.
- Utilized a range of existing big data and data warehouse technologies and data modeling techniques, such as star schema, snowflake, hybrid, and newer designs optimized for data acquisition, storage, and visualization, and evangelized these data models and techniques.
- Built REST APIs to easily add new analytics or issuers into the model.
- Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
- Designed both 3NF data models for ODS/OLTP systems and dimensional data models using star and snowflake schemas.
- Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using Erwin.
- Developed and configured the Informatica MDM hub to support the Master Data Management (MDM), Business Intelligence (BI), and Data Warehousing platforms to meet business needs.
- Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
- Worked with project management, business teams and departments to assess and refine requirements to design/develop BI solutions using MS Azure.
- Configured Hadoop Ecosystems to read data transaction from HDFS and Hive.
- Gathered and documented the audit trail and traceability of extracted information for data quality.
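A minimal HiveQL sketch of the HDFS-to-Hive pattern referenced above; the HDFS path, table, and columns are hypothetical placeholders rather than actual project objects.

```sql
-- Expose delimited files already landed in HDFS as an external Hive table.
CREATE EXTERNAL TABLE IF NOT EXISTS claims_staging (
    claim_id     STRING,
    member_id    STRING,
    claim_amount DECIMAL(12, 2),
    service_date DATE
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/raw/claims/';

-- SQL-like access on the Hadoop data once the table is defined.
SELECT member_id, SUM(claim_amount) AS total_claims
FROM claims_staging
GROUP BY member_id;
```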
Environment: Erwin 9.7, Big Data 3.0, MS Azure, T-SQL, Python, MapReduce, MDM, API, SQL, SSIS, NoSQL, SSRS, Oracle 12c, CSV, Hive 2.3, Hadoop 3.0, OLTP, HBase 1.2, OLAP, 3NF, HDFS
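A minimal T-SQL sketch of the extract/aggregate/insert stored-procedure pattern referenced above; the schema, table, and column names are hypothetical.

```sql
-- Aggregate staged claim rows for one load date and insert them into a fact table.
CREATE PROCEDURE dbo.usp_LoadDailyClaimFact
    @LoadDate DATE
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dw.FactClaimDaily (DateKey, MemberKey, ClaimCount, ClaimAmount)
    SELECT
        CONVERT(INT, FORMAT(@LoadDate, 'yyyyMMdd')) AS DateKey,
        m.MemberKey,
        COUNT(*)           AS ClaimCount,
        SUM(s.ClaimAmount) AS ClaimAmount
    FROM stg.Claim AS s
    JOIN dw.DimMember AS m
        ON m.MemberId = s.MemberId
    WHERE s.ServiceDate = @LoadDate
    GROUP BY m.MemberKey;
END;
```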
Confidential, Phoenix, AZ
Data Modeler
Responsibilities:
- Worked as a Data Modeler to generate data models using Erwin and developed relational database systems.
- Worked on Software Development Life Cycle (SDLC) with good working knowledge of testing, agile methodology, disciplines, tasks, resources and scheduling.
- Worked in an Agile environment using standard Agile tooling.
- Developed source-to-target mapping documents between existing databases and cloud-based NoSQL databases on AWS to support data extracts shared with client partners.
- Worked on a Hadoop platform to implement big data solutions using Hive, MapReduce, shell scripting, and Pig.
- Used Python for SQL operations in the database and for file extraction, transformation, and generation.
- Conducted JAD sessions with management, vendors, users and other stakeholders for open and pending issues to develop specifications.
- Developed a long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per that roadmap.
- Developed Data Mapping, Data Governance, and Transformation and cleansing rules for the Master Data Management Architecture involving OLTP.
- Worked on AWS Redshift and RDS, implementing models and data on both services.
- Created tables, views, indexes, Partitions and generated SQL scripts using Erwin.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
- Designed the data marts using Ralph Kimball's dimensional data mart modeling methodology in Erwin.
- Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data.
- Used Python APIs to extract daily data from multiple vendors.
- Developed PL/SQL procedures, functions, cursors, packages, views, and materialized views (a PL/SQL sketch follows this list).
- Used SQL*Loader, external tables, and import/export utilities to load data (see the external-table sketch at the end of this project entry).
- Developed complex mapping to extract data from diverse sources including flat files, RDBMS tables, legacy system files, XML files, and Applications.
- Performed data analysis, data migration, and data profiling using complex SQL on various source systems, including SQL Server.
- Worked on setting up AWS DMS and SNS for data transfer and replication.
- Developed logical data models and physical database design and generated database schemas using Erwin.
- Designed Data Flow Diagrams, E/R Diagrams and enforced all referential integrity constraints.
- Developed and maintained data models, data dictionaries, data maps, and other artifacts across the organization, including conceptual and physical models as well as the metadata repository.
- Performed extensive Data Validation, Data Verification against Data Warehouse and performed debugging of the SQL-Statements and stored procedures.
- Designed and implemented basic SQL queries for testing and report/ data validation.
- Involved in data analysis and modeling for the OLAP and OLTP environment.
- Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and stored procedures.
- Created SSIS packages and SSRS reports to move data from text files into the SQL Server database.
- Ensured the compliance of the extracts to the Data Quality Center initiatives.
- Designed reports in Access and Excel using advanced features including, but not limited to, pivot tables and formulas.
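A minimal PL/SQL sketch of the procedure-and-cursor development referenced above; all object names are hypothetical.

```sql
-- Load one day of staged vendor rows into a reporting table via an explicit cursor.
CREATE OR REPLACE PROCEDURE load_vendor_daily (p_load_date IN DATE) IS
    CURSOR c_src IS
        SELECT vendor_id, trade_date, gross_amount
        FROM   stg_vendor_feed
        WHERE  trade_date = p_load_date;
BEGIN
    FOR r IN c_src LOOP
        INSERT INTO dw_vendor_daily (vendor_id, trade_date, gross_amount)
        VALUES (r.vendor_id, r.trade_date, r.gross_amount);
    END LOOP;
    COMMIT;
END load_vendor_daily;
/
```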
Environment: Big Data 3.0, Erwin 9.6, Agile, Python, AWS, API, PL/SQL, NoSQL, XML, SQL, OLAP, OLTP, T-SQL, SSRS, SSIS, MS Access 2016
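A minimal Oracle external-table sketch of the flat-file load pattern referenced above; the directory object, file name, and columns are hypothetical, and a date mask may be needed depending on the actual file format.

```sql
-- Define a CSV file in an Oracle directory object as an external table,
-- then load it into a staging table with a plain INSERT ... SELECT.
CREATE TABLE ext_vendor_feed (
    vendor_id    NUMBER,
    trade_date   DATE,
    gross_amount NUMBER(12, 2)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
    )
    LOCATION ('vendor_feed.csv')
)
REJECT LIMIT UNLIMITED;

INSERT INTO stg_vendor_feed (vendor_id, trade_date, gross_amount)
SELECT vendor_id, trade_date, gross_amount
FROM   ext_vendor_feed;
```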
Confidential, Trenton, NJ
Data Analyst
Responsibilities:
- Analyzed the client data and business terms from a data quality and integrity perspective.
- Performed root cause analysis on smaller self-contained data analysis tasks that are related to assigned data processes.
- Performed data analysis and data profiling using complex SQL on various source systems (a profiling-query sketch follows this list).
- Built multi-function readmission reports using Python pandas and the Django framework.
- Coordinated with different data providers to source the data and build the Extraction, Transformation, and Loading (ETL) modules based on the requirements to load the data from source to stage and performed Source Data Analysis.
- Worked on importing and cleansing high-volume data from various sources.
- Worked on Data Profiling, Data cleansing, Data Mapping and Data Quality.
- Designed and deployed rich graphic visualizations with drill-down, drop-down menu, and parameter options using Tableau.
- Conducted analysis of the various tools available at the client to recommend the best option for each project.
- Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
- Delivered enterprise Data Governance, Data Quality, Metadata, and ETL solutions using Informatica.
- Interfaced with business users to verify business rules and communicated changes to ETL development team.
- Worked to identify and resolve performance issues in Informatica mappings and sessions.
- Extracted data from the database using SAS/ACCESS and SAS SQL procedures and created SAS data sets.
- Wrote several Teradata SQL Queries using Teradata SQL Assistant for Ad Hoc Data Pull request.
- Performed statistical data analysis and data visualization using Python.
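A minimal SQL data-profiling sketch of the kind of analysis referenced above; the source table and columns are hypothetical.

```sql
-- Row counts, null rates, distinct values, and value ranges for key columns.
SELECT
    COUNT(*)                    AS row_count,
    COUNT(member_id)            AS non_null_member_id,
    COUNT(*) - COUNT(member_id) AS null_member_id,
    COUNT(DISTINCT member_id)   AS distinct_member_id,
    MIN(admit_date)             AS min_admit_date,
    MAX(admit_date)             AS max_admit_date
FROM src_admissions;

-- Duplicate check on a hypothetical natural key.
SELECT member_id, admit_date, COUNT(*) AS dup_count
FROM src_admissions
GROUP BY member_id, admit_date
HAVING COUNT(*) > 1;
```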
Environment: SQL, ETL, Tableau, SAS, MS Access, Teradata 11, Python
Confidential, Houston, TX
Data Analyst
Responsibilities:
- Worked with the business analysts to understand the project specification and helped them to complete the specification.
- Worked in Data Analysis, data profiling and data governance identifying Data Sets, Source Data, Source Metadata, Data Definitions and Data Formats.
- Used MS Excel, Pivot tables and charts, MS PowerPoint, MS Outlook, MS Communicator and User Base to perform responsibilities.
- Extracted data using SSIS from DB2, XML, Excel, and flat files, performed transformations, and populated the data warehouse.
- Wrote Teradata SQL queries and created tables and views following Teradata best practices.
- Prepared Business Requirement Documentation and Functional Documentation.
- Primarily responsible for coordinating between the project sponsor and stakeholders.
- Conducted JAD sessions with different stakeholders, such as editorial staff and designers.
- Extensive SQL experience in querying, data extraction, and data transformation.
- Experienced in developing business reports by writing complex SQL queries using views, macros, volatile and global temporary tables.
- Developed numerous reports to capture the transactional data for the business analysis.
- Collaborated with a team of Business Analysts to ascertain capture of all requirements.
- Involved in writing complex SQL queries using correlated subqueries, joins, and recursive queries (see the sketch after this list).
- Designed reports in Excel using advanced features including, but not limited to, pivot tables and formulas.
- Used SQL and PL/SQL to validate the data going into the data warehouse.
- Wrote complex SQL and PL/SQL testing scripts for backend testing of the data warehouse application, querying Teradata and Oracle.
- Extensively tested the Business Objects report by running the SQL queries on the database by reviewing the report requirement documentation.
- Implemented the Data Cleansing using various transformations.
- Used DataStage Director for running jobs and monitoring performance statistics.
- Designed and implemented basic SQL queries for testing and report/data validation.
- Ensured the compliance of the extracts to the Data Quality Center initiatives.
- Wrote multiple SQL queries to analyze the data and presented the results using Excel and Crystal Reports.
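A minimal sketch of the correlated-subquery and recursive-query patterns referenced above; table names are hypothetical, and recursive CTE syntax (WITH RECURSIVE shown here) varies slightly between engines.

```sql
-- Correlated subquery: latest transaction per account.
SELECT t.account_id, t.txn_date, t.amount
FROM   transactions t
WHERE  t.txn_date = (SELECT MAX(t2.txn_date)
                     FROM   transactions t2
                     WHERE  t2.account_id = t.account_id);

-- Recursive query: walk an account hierarchy from the root accounts down.
WITH RECURSIVE account_tree (account_id, parent_id, lvl) AS (
    SELECT account_id, parent_id, 1
    FROM   accounts
    WHERE  parent_id IS NULL
    UNION ALL
    SELECT a.account_id, a.parent_id, t.lvl + 1
    FROM   accounts a
    JOIN   account_tree t ON a.parent_id = t.account_id
)
SELECT account_id, lvl
FROM   account_tree;
```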
Environment: MS Excel, DB2, XML, SQL, PL/SQL