- 7+ years of technical experience in Oracle PL/SQL and data modeling, developing effective and efficient solutions and ensuring client deliverables within committed timelines.
- Experienced Data Modeler with strong conceptual, logical and physical data modeling skills and data profiling skills; experienced in maintaining data quality, conducting JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries.
- Experienced in dimensional data modeling, Star/Snowflake schemas, and Fact & Dimension tables.
- Expertise in AWS resources such as EC2, S3, EBS, VPC, ELB, SNS, RDS, IAM, Route 53, Auto Scaling, CloudFormation, CloudWatch, Security Groups, Redshift, Lambda, Kinesis, and Data Pipeline.
- Experienced in developing Conceptual, Logical and Physical models for Online Transaction Processing and Online Analytical Processing (OLTP & OLAP) systems.
- Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2, Teradata, Hive and AWS.
- Experienced in troubleshooting SQL queries, ETL jobs, and data warehouse and data mart data store models.
- Expert in documenting the Business Requirements Document (BRD), generating the UAT Test Plans, maintaining the Traceability Matrix and assisting in Post Implementation activities.
- Enterprise Data Modeler with a deep understanding of developing enterprise data models that strictly meet normalization rules, as well as enterprise data warehouses using the Kimball and Inmon data warehouse methodologies.
- Knowledgeable in best practices and design patterns, cube design, BI strategy and design, and 3NF modeling.
- Extensive working experience with Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) system environments.
- Handled performance tuning, conducted backups, and ensured the integrity and security of databases; managed PostgreSQL databases in the AWS environment, including Aurora PostgreSQL.
- Expert in SQL queries, PL/SQL packages, SQL*Plus, stored procedures, functions, triggers and performance analysis, creating partitions, indexes and aggregate tables when required.
- Expert in physical modeling for multiple platforms such as Oracle, Teradata, SQL Server and DB2.
- Involved in creating Physical and Logical models using Erwin.
Data Modeling/ETL/Data Pipeline Tools: Erwin, ER/Studio, Visual Paradigm, Talend Open Studio, Informatica, DataStage, SSIS, Apache Flink
Big Data Ecosystem: Hive, Spark (Scala/Python)
Data Ingestion: Sqoop, Flume, NiFi, Kafka
BI Tools: AWS QuickSight, Power BI, TIBCO Spotfire, SSRS, JasperReports
Scripting Languages: Bash, Perl, Python, R
Databases: Oracle 10g/11g, PostgreSQL 9.3, MySQL, SQL Server, Teradata
IDEs: IntelliJ, Eclipse, Visual Studio, IDLE
Data Modeling Methodologies: Object-Relational Modeling, ER Modeling, Dimensional Modeling (Kimball/Inmon), Data Vault 2.0
Cloud Technologies: AWS Redshift, Kinesis, EMR, QuickSight, Aurora PostgreSQL
Methodologies: Agile, Scrum, Iterative Development, Waterfall Model, UML, Design Patterns
Confidential, Atlanta, Georgia
- Working as a Data Modeler generating data models using Erwin r9.64 and developing relational database systems (RDBMS).
- Architected, researched, evaluated, and deployed new tools, frameworks, and patterns to build sustainable Big Data platforms for our clients.
- Translated business requirements into workable functional and non-functional requirements at a detailed production level using Workflow Diagrams, Sequence Diagrams, Activity Diagrams and Use Case Modeling.
- Have been working with AWS cloud services (VPC, EC2, S3, RDS, Redshift, Data Pipeline, EMR, DynamoDB, Workspaces, Lambda, Kinesis, SNS, SQS).
- Involved in creating Physical and Logical models using Erwin.
- Worked on building the data model using Erwin as per the requirements. Designed the grain of facts depending on reporting requirements.
- Involved with data analysis, primarily identifying data sets, source data, source metadata, data definitions and data formats.
- Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
- Expert in data analysis, design, development, implementation and testing using data conversions and Extraction, Transformation and Loading (ETL) with Oracle, SQL Server and other relational and non-relational databases.
- Involved in normalization/denormalization techniques for optimum performance in relational and dimensional database environments.
- Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
- Highly proficient in data modeling, retaining concepts of RDBMS and of logical and physical data modeling up to Third Normal Form (3NF), as well as multidimensional data modeling schemas (Star schema, Snowflake modeling, facts and dimensions).
- Used data analysis techniques to validate business rules and identify low-quality and missing data in the existing data.
- Migrated an existing on-premises application to AWS, using services such as EC2 and S3 for processing and storage; experienced in maintaining Hadoop on AWS EMR.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to profile data from DB2 and SQL Server database systems.
- Worked with data compliance and data governance teams to maintain data models, metadata and data dictionaries; defined source fields and their definitions.
Confidential, Atlanta, Georgia
- Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
- Documented logical, physical, relational and dimensional data models. Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
- Prepared documentation for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, and the glossary as they evolved and changed during the project.
- Coordinated with the DBA on database builds and table normalizations and de-normalizations.
- Identified the entities and the relationships between them to develop the Conceptual Model using Erwin.
- Developed Logical Model from the conceptual model.
- Responsible for different data mapping activities from source systems.
- Involved in data model reviews with the internal data architect, business analysts and business users, explaining the data model to make sure it is in line with business requirements.
- Involved with data profiling activities for new sources before creating new subject areas in the warehouse.
- Extensively worked on Data Governance, i.e., metadata management, Master Data Management, data quality and data security.
- Redefined many attributes and relationships in the reverse-engineered model and cleansed unwanted tables/columns on the Teradata database as part of data analysis responsibilities.
- Performed complex data analysis in support of ad-hoc and standing customer requests.
- Delivered data solutions in report/presentation format according to customer specifications and timelines.
- Used a reverse engineering approach to redefine entities, relationships and attributes in the data model per new specifications in Erwin after analyzing the database systems currently in use.
- Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
- Created the test environment for the staging area, loading the staging area with data from multiple sources.
- Involved in SQL development, unit testing and performance tuning, ensuring testing issues were resolved based on defect reports.
- Tested the ETL process both before and after data validation. Tested the messages published by the ETL tool and the data loaded into various databases.
- Experienced in creating UNIX scripts for file transfer and file manipulation.
- Tested the database for field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
- Defined the scope of the project, gathered the business requirement document, and performed GAP analysis.
- Implemented Data Lake using Hadoop architecture.
- Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL access on Hadoop data.
- Created rules using IBM InfoSphere Master Data Management for the MDM solution.
- Worked on IBM InfoSphere Data Architect to create logical data models.
- Involved in converting user requirements into business, functional and technical requirements, and created business process models from the requirement specs. Managed requirements using DOORS.
- Defined the business logic for the web-services being used for the SOA based application.
- Worked to create physical data designs / first-cut data models for various projects/contracts.
- Extensively worked on performance tuning for the project using IBM InfoSphere DataStage 8.5.
- Worked on different data formats such as flat files, SQL files, databases, XML schemas and CSV files.
- Involved in project cycle planning for the data warehouse: source data analysis, data extraction process, transformation and ETL loading strategy design.
- Involved in running Hadoop streaming jobs to process terabytes of text data. Worked with different file formats such as Text, Sequence files, Avro, ORC and Parquet.
- Used Flume to collect log data from different sources and transferred the data to Hive tables, using different SerDes to store it in JSON, XML and Sequence file formats.
- Designed the layered architecture for the Data Lake.
- Successfully implemented projects using the Data Lake strategies.
- Analyzed system specifications, business requirements for full understanding of the project to comply with corporate rules and regulations.
- Involved in identifying the process flow, the workflow and data flow of the core systems.
- Worked extensively on user requirements gathering and gap analysis.
- Involved in full development cycle of Planning, Analysis, Design, Development, Testing, and Implementation.
- Developed PL/SQL triggers and master tables for automatic creation of primary keys.
- Involved in data analysis for data conversion, including data mapping from source to target database schemas, specification and writing of data extract scripts/programming for data conversion, in test and production environments.
- Developed advanced PL/SQL packages, procedures, triggers, functions, indexes and collections to implement business logic using SQL Navigator. Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
- Used the SQL Server SSIS tool to build high-performance data integration solutions, including extraction, transformation and load packages for data warehousing. Extracted data from the XML file and loaded it into the database.
- Designed and developed Oracle Forms & Reports, generating up to 60 reports.
- Involved in data loading and extracting functions using SQL*Loader.
- Performed database administration of all database objects, including tables, clusters, indexes, views, sequences, packages and procedures.
- Designed and developed all the tables and views for the system in Oracle.
- Designed and developed forms validation procedures for querying and updating data.
- Handled errors using exception handling extensively for ease of debugging and displaying error messages in the application.
- Tested all forms and PL/SQL code for logic correctness.