- Over 8 years of experience in the data field working as a Data Analyst/Data Modeler, designing and developing OLTP systems, Enterprise Data Warehouses, Data Marts and OLAP systems.
- Experience in SDLC and Agile/Scrum project methodologies across the Analysis, Design, Development, Testing and Implementation phases of Client/Server-based applications.
- Experience in Data analysis, profiling, migration and conversion. Broad knowledge of Data Governance and Integration.
- Experience in creating data mapping specifications and interface documents.
- Experience in writing T-SQL queries, dynamic queries, sub-queries and complex joins, and in creating complex Stored Procedures, Triggers, User-Defined Functions, Views, Constraints (Primary Key, Foreign Key, Unique Key) and Cursors.
- Excellent understanding of Data Warehousing concepts - Star and Snowflake schemas, Slowly Changing Dimension (SCD) types, Normalization (1NF, 2NF, 3NF and BCNF)/De-Normalization, and Dimension and Fact tables.
- Strong experience in implementing Data Warehouse solutions in Redshift. Worked on various projects to migrate data from on-premises databases to Redshift, RDS and S3.
- Experience in Database Administration operations like Backup, Recovery, Replication and using SQL Profiler and excellent knowledge of DDL and DML in SQL.
- Experience in modeling Transactional Databases and Data Warehouse using tools like Erwin, Oracle Designer, Power Designer and ER/Studio.
- Good knowledge of AWS EC2, ELB, EMR, Redshift, DMS, S3, EBS and Lambda.
- Solid understanding of Relational Database Systems, Normalization, Star and Snowflake logical and physical data modeling using ERWIN data modeler and MS Visio.
- Experience in Extraction, Transformation and Loading (ETL) processes to acquire and load data from internal and external sources, including Import/Export and Bulk Insert.
- Experience in performance tuning of stored procedures and queries using tools like SQL Profiler, Database Tuning Advisor and Execution Plans, and through efficient coding standards.
- Excellent knowledge of Data Analysis, Data Validation, Data Cleansing and Data Verification; involved in preparing Test Scenarios, Test Plans, Test Cases and Test Data.
- Experience in Data Modeling for OLTP and OLAP systems.
- Experience in gathering, managing and documenting Business Requirement Documents (BRDs) and Functional Requirement Documents (FRDs).
- Experience in conducting Joint Application Development (JAD) sessions.
- Proficient in using UML for Business Process Modeling, Use Cases, Activity Diagrams, Sequence Diagrams, Data Flow Diagrams, Collaboration Diagrams, Class Diagrams, Wireframe Prototypes and documenting them using MS Visio.
- Experience in ETL tools like Informatica, SSIS and reporting tools such as Tableau, MicroStrategy.
- Proficient in Tableau, a data visualization tool, to analyze large datasets and create visually compelling, actionable interactive reports and dashboards.
- Creative, self-motivated, quick learner and excellent team player, with the ability to meet deadlines and work under pressure.
Databases: SQL Server, Oracle, MySQL, Teradata, PostgreSQL
BI tools: Tableau, MicroStrategy
Tools and Software: Oracle SQL Developer, SQL Server Management Studio, Control-M (Automation and Scheduling), Teradata SQL Assistant, TOAD for Oracle, MS Office Suite, MS Visio.
Operating Systems: Windows, Linux, UNIX
Analysis/Modeling Tools: Erwin, ER Studio, Informatica Data Quality, Informatica Data Explorer
Cloud: AWS, Snowflake, Redshift
ETL: Informatica, SSIS
Quality Management: HP ALM, Quality Center
Programming Languages: SQL, PL/SQL, R
Confidential, Lowell, AR
Senior Data Analyst/ Senior Data Modeler
- Responsible for the strategic direction of the technical environment, analyzing data requirements, logical and physical modeling (using the Erwin data modeling tool), and establishing data relationships with the business.
- Generated weekly and monthly reports for various business users according to business requirements.
- Worked with datasets of varying size and complexity, including both structured and unstructured data.
- Used the Agile/Scrum method for gathering requirements and facilitated user story workshops; followed Agile methodology to analyze and design business needs in an iterative manner.
- Created SnowSQL scripts to extract data from S3 buckets into Snowflake tables and transform the data according to business requirements.
- Conducted meticulous gap analysis while successfully reengineering key business processes to increase operational efficiency and alignment of business unit objectives.
- Provided visual/graphic analysis based on data content; responsible for data validation.
- Performed upgrades, source-to-target mappings, storage capacity planning and ETL development.
- Worked on development of data warehouse and ETL systems using relational and non-relational tools like SQL. Developed and maintained ETL mappings to extract data from multiple source systems like Oracle and SQL Server.
- Conducted data mining and data modeling; collected, cleansed, modeled and analyzed structured and unstructured data.
- Responsible for analyzing, reviewing, modifying and developing ETL packages using Informatica.
- Designed data flows that extract, transform and load (ETL) data while optimizing performance.
- Wrote SQL scripts to extract data from tables to validate data flow, and performed numerous data-pull requests using SQL for analysis. Migrated NoSQL clusters from one datacenter to another without downtime.
- Involved in writing Test Cases, procedures, reports and approval of software release.
- Defined relationships and created actions, filters, Level-of-Detail expressions, parameters, data blending, hierarchies, calculated fields, sorting, groupings, and live and in-memory connections in both Tableau and Excel. Created tables, views, indexes and SQL joins.
Confidential, Newport Beach, CA
Data Analyst/Business Analyst
- Created a dimensional model for the reporting system by identifying required dimensions and facts using ER Studio. Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
- Conducted interviews, meetings and JAD sessions during the process of Requirement Gathering.
- Planned and documented procedures for data processing and prepared data flow diagrams for the application.
- Translated business requirements into conceptual, logical and physical data models. DDLs were then generated from the CA ERwin Data Modeling Tool and provided to the DBA to apply in the physical database.
- Involved in Data Extraction, Transformation and Loading (ETL) from source to target systems using DataStage.
- Designed the data flow process to extract the data from Data Warehouse and load it into Redshift.
- Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
- Performed SSIS ETL package validation and reference data management, and tested master data match groups and match results based on match scores.
- Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.
- Performed numerous data-pull requests using SQL Server and Tableau for analysis.
- Responsible for designing logical and physical data modeling for various data sources on Redshift.
- Cleaned, processed and loaded data through ETL and SSIS systems and queries in SQL Server, presented in MS Excel, Tableau report or PDF.
- Created new database objects like tables, procedures, functions, indexes and views.
- Used BPMN (Business Process Model and Notation) in gap analysis to bridge the gap between business processes (BRs and FRs).
- Analyzed business requirements and segregated them into Use Cases.
- Created Use case diagrams and documented Use case specifications.
- Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
- Verified and analyzed data mapping issues to identify effective fixes.
- Utilized SQL Server Integration Services for daily data extractions from vendors and loads into the SQL Server database.
- Worked closely with Business Intelligence Analysts to complete reporting tasks, monitor automated SSIS jobs, and analyze data.
- Gathered, analyzed, and documented data requirements for projects of medium to high complexity and moderate to high risk, performed source system data quality analysis.
Confidential, Denver, CO
Senior Data Modeler
- Developed the logical data models and physical data models that capture current state/future state data elements and data flows using ER Studio.
- Designed the ER diagrams, the logical model (relationships, cardinality, attributes and candidate keys) and the physical database (capacity planning, object creation and aggregation strategies) for Oracle and Teradata.
- Used efficient DATA steps to access and manipulate data from different sources and used PROC steps to generate reports from them.
- Used Informatica to extract data from different databases like Teradata, Oracle, MS Access and SQL Server.
- Involved in maintaining and updating Metadata Repository with details on the nature and use of applications/data transformations to facilitate impact analysis. Created DDL scripts using ER Studio and source to target mappings to bring the data from source to the warehouse.
- Worked on importing and cleansing of high-volume data from various sources like Teradata, Oracle, flat files and MS SQL Server.
- Designed Logical & Physical Data Model /Metadata/ data dictionary using Erwin for both OLTP and OLAP based systems.
- Involved in meetings with SME (subject matter experts) for analyzing the multiple sources.
- Involved in SQL queries and optimizing the queries in Teradata.
- Developed a data mart for the base data in Star and Snowflake schemas as part of developing the data warehouse for the database.
- Worked on data mapping process from source system to target system. Created dimensional model for the reporting system by identifying required facts and dimensions using Erwin.
- Resolved data inconsistencies between the source systems and the target system using the mapping documents and by analyzing the database with SQL queries.
Data Analyst/Data Modeler
- Created Business requirement document (BRD), functional requirements specifications (FRS), and technical requirement specification along with Use Case for application development.
- Conducted Interviews, brainstorming and focus groups to identify requirements and then documented them in a format that can be reviewed and understood by both business and developers.
- Facilitated Joint Application Development (JAD) sessions to resolve differences between business requirements and developers' interpretations.
- Performed Data Analysis and Data validation by writing complex SQL queries.
- Created Business Requirements and converted them into detailed Use Cases, Report Specifications and Non-Functional Requirements.
- Analyzed Requirements and created Use Cases, Use Case Diagrams, Activity Diagrams, sequence diagram using MS-Visio.
- Provided primary liaison between business/operations and development.
- Created data models and data flow diagrams for the to-be process.
- Researched upstream data sources and ensured all source data sources were attested by its owners.
- Created and executed SQL queries to validate data movement and generate expected results for UAT.
- Conducted UAT to verify that all requirements were met by the application.
- Worked with the developers on resolving the reported bugs and various technical issues.
- Interacted with the end users to identify and gather business requirements and transform them into technical requirements.
- Identified and prepared Use Cases, BPM, and Activity Diagrams using Rational Rose according to UML methodology.