Sr. Data Architect/Data Modeler Resume

Wayne, PA

SUMMARY:

  • Over 9 years of experience as a Data Architect/Data Modeler and Data Analyst in data analysis, data modeling, data architecture, and designing, developing, and implementing data models for enterprise-level applications and systems.
  • Experience in BI/DW solutions (ETL, OLAP, data marts), Informatica, and BI reporting tools such as Tableau and QlikView; experienced leading teams of application, ETL, and BI developers as well as testing teams.
  • Experience with the Hadoop big data ecosystem for ingestion, storage, querying, processing, and analysis of big data.
  • Strong experience with different project methodologies including Agile Scrum Methodology and Waterfall methodology.
  • Strong experience in Normalization (1NF, 2NF, 3NF and BCNF) and Denormalization techniques for effective and optimum performance.
  • Good understanding and hands-on experience in setting up and maintaining NoSQL databases like Cassandra, MongoDB, and HBase.
  • Expertise in database performance tuning using Oracle hints, Explain Plan, TKPROF, partitioning, and indexes.
  • Extensive experience with ETL & reporting tools such as SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
  • Experience working with data modeling tools like Erwin, Power Designer and ER Studio.
  • Experienced in various Teradata utilities like Fastload, Multiload, BTEQ, and Teradata SQL Assistant.
  • Excellent experience in creating cloud-based solutions and architectures using Amazon Web Services (Amazon EC2, Amazon S3, Amazon RDS) and Microsoft Azure.
  • Excellent grasp of the Software Development Life Cycle (SDLC), with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
  • Experience in analyzing data using Hadoop Ecosystem including HDFS, Hive, Spark, Spark Streaming, Elastic Search, Kibana, Kafka, HBase, Zookeeper, PIG, Sqoop, Flume.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Experience with JAD sessions for requirements gathering and for writing functional specifications and queries.
  • Expertise in data management: Data Governance, Data Integration, Metadata, Reference Data, and MDM.
  • Extensive experience in development of T-SQL, Oracle PL/SQL Scripts, Stored Procedures and Triggers for business logic implementation.
  • Extensive experience with OLTP/OLAP System and E-R modeling, developing Database Schemas like STAR schema and Snowflake schema, FACT & Dimension tables used in relational, dimensional and multidimensional modeling.
  • Experience in Agile/Scrum, iterative and waterfall SDLC methodologies.
  • Experience as an architect creating UML models and leveraging advanced executable code generators to target different domains.
  • Experience in integration of various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL database.
  • Experienced with JIRA for planning, tracking, reporting, and release management.
  • Experience in developing MapReduce programs using Apache Hadoop to analyze big data per requirements.
  • Hands on experience with Forward & Reverse Engineering approach to redefine entities, relationships and attributes in the data model.
  • Excellent understanding of MDM hub architecture styles: the registry, repository, and hybrid approaches.
  • Experienced in analyzing ETL framework metadata to assess the current-state ETL implementation.

TECHNICAL SKILLS:

Data Modeling Tools: Erwin 9.6/9.5, ER/Studio 9.7/9.0, Sybase PowerDesigner

Big Data Technologies: Hadoop, Hive, HDFS, HBase, Flume, Sqoop, Spark, Pig, Impala, MapReduce.

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2

Operating Systems: Microsoft Windows 8.1/7 and UNIX & Linux.

ETL/Data warehouse Tools: Informatica 9.6/9.1, Tableau

Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD)

ETL Tools: SSIS, SSRS, Informatica PowerCenter 9.6

Reporting Tools: Business Objects, Crystal Reports 8/9.0

Languages: Unix Shell Scripting, HTML, T-SQL, Data Structures, Algorithms.

Tools & Utilities: TOAD 9.6, Microsoft Visio 2010, MS Office 2010/2007

BI Tools: Tableau, Tableau server, Tableau Reader

PROFESSIONAL EXPERIENCE:

Confidential - Wayne, PA

Sr. Data Architect/Data Modeler

Responsibilities:

  • Worked as a Data Modeler/Architect generating data models using Erwin and developing relational database systems.
  • Served in a Data Architect role, reviewing business requirements and composing source-to-target data mapping documents.
  • Researched, evaluated, architected, and deployed new tools, frameworks, and patterns to build sustainable big data platforms for clients.
  • Designed the logical data model using Erwin 9.64 with the entities and attributes for each subject area.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Designed the big data platform technology architecture, with scope covering data intake, data staging, data warehousing, and a high-performance analytics environment.
  • Loaded data from the Linux file system to HDFS, imported and exported data between HDFS and Hive using Sqoop, and implemented partitioning, dynamic partitions, and buckets in Hive (see the HiveQL sketch after this list).
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Specified the overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, ETL, and BI.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS systems.
  • Applied normalization and denormalization techniques for optimum performance in relational and dimensional database environments.
  • Performance tuning and stress-testing of NoSQL database environments in order to ensure acceptable database performance in production mode.
  • Implemented strong referential integrity and auditing by the use of triggers and SQL Scripts.
  • Created and managed database objects (tables, views, indexes, etc.) per application specifications. Implemented database procedures, triggers and SQL scripts for development teams.
  • Designed and developed T-SQL stored procedures to extract, aggregate, transform, and insert data (see the T-SQL sketch after this list).
  • Extracted large volumes of data from Amazon Redshift, AWS, and Elasticsearch using SQL queries to create reports.
  • Created and maintained SQL Server scheduled jobs, executing stored procedures for the purpose of extracting data from DB2 into SQL Server.
  • Developed SQL Stored procedures to query dimension and fact tables in data warehouse.
  • Performed Hive programming for applications migrated to the big data platform using Hadoop.
  • Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
  • Generated parameterized queries for generating tabular reports using global variables, expressions, functions, and stored procedures using SSRS.
  • Used SSRS to create reports, customized Reports, on-demand reports, ad-hoc reports and involved in analyzing multi-dimensional reports in SSRS.
  • Involved in designing Logical and Physical data models for different database applications using the Erwin.
  • Performed data modeling and designed, implemented, and deployed high-performance custom applications at scale on Hadoop/Spark.
  • Executed Hive queries on Parquet tables stored in Hive to perform data analysis to meet the business requirements.
  • Developed MapReduce programs to cleanse data in HDFS obtained from heterogeneous data sources, making it suitable for ingestion into Hive.
  • Implemented Data Integrity and Data Quality checks in Hadoop using Hive and Linux scripts.
  • Applied data analysis, data mining and data engineering to present data clearly.
  • Reverse engineered some of the databases using Erwin.
  • Routinely dealt with large internal and vendor data sets, performing performance tuning, query optimization, and production support for SAS and Oracle 12c.
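
A minimal HiveQL sketch of the partitioning, dynamic-partition, and bucketing work described above; the table and column names (claims_raw, claims_part, member_id, load_date) are illustrative assumptions, not the client's actual schema.

    -- Hypothetical partitioned, bucketed Hive table (names are illustrative).
    CREATE TABLE claims_part (
        claim_id     STRING,
        member_id    STRING,
        claim_amount DECIMAL(12,2)
    )
    PARTITIONED BY (load_date STRING)
    CLUSTERED BY (member_id) INTO 32 BUCKETS
    STORED AS PARQUET;

    -- Enable dynamic partitioning so the INSERT derives partitions from the data.
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;

    -- The staging table claims_raw would be populated via a Sqoop import into HDFS.
    INSERT OVERWRITE TABLE claims_part PARTITION (load_date)
    SELECT claim_id, member_id, claim_amount, load_date
    FROM claims_raw;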
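And a minimal T-SQL sketch of an extract-aggregate-insert stored procedure of the kind described above; the procedure and table names (usp_LoadDailyClaimSummary, StageClaims, DailyClaimSummary) are hypothetical.

    -- Hypothetical T-SQL procedure: aggregate staged rows into a summary table.
    CREATE PROCEDURE dbo.usp_LoadDailyClaimSummary
        @LoadDate DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.DailyClaimSummary (LoadDate, MemberId, TotalAmount, ClaimCount)
        SELECT @LoadDate, MemberId, SUM(ClaimAmount), COUNT(*)
        FROM dbo.StageClaims
        WHERE LoadDate = @LoadDate
        GROUP BY MemberId;
    END;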

Environment: CA Erwin 9.64, Oracle 12c, MS-Office, SQL Architect, TOAD Benchmark Factory, SQL Loader, PL/SQL, Redshift, SQL Server 2016, Hive, Pig, Hadoop, Spark, AWS.

Confidential - Chicago, IL

Sr. Data Architect/Data Modeler

Responsibilities:

  • Responsible for technical data governance, enterprise-wide data modeling, and database design.
  • Developed multidimensional data models to support BI solutions as well as common industry data from external systems.
  • Worked with business partners and team members to gather and analyze requirements, translating them into database design solutions supporting transactional system data integration, reports, spreadsheets, and dashboards.
  • Planned, defined, and designed databases using Erwin based on business requirements and provided documentation.
  • Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
  • Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL access to Hadoop data.
  • Created Complex SSAS cubes with multiple Fact and Measure groups, and multiple dimension hierarchies based on the OLAP reporting needs.
  • Created an SSAS tabular semantic model in DirectQuery mode with multiple partitions, KPIs, hierarchies, and calculated measures using DAX per business requirements.
  • Designed fact and dimension tables and defined relationships between facts and dimensions with star schema and snowflake schema in SSAS (a star schema DDL sketch follows this list).
  • Worked with project management, business teams and departments to assess and refine requirements to design/develop BI solutions using MS Azure.
  • Researched and developed hosting solutions using Azure and other third-party hosting and software-as-a-service solutions.
  • Created SQL tables with referential integrity and developed SQL queries using SQL Server and Toad.
  • Worked with Azure Machine Learning, Azure Event Hubs, Azure Stream Analytics, and PivotTables, handling multi-table data sets of up to 140 million records in SQL (MS SQL Server, SAS PROC SQL, etc.).
  • Created tabular data models and implemented Power BI for a POC in a SharePoint environment.
  • Partnered directly with the Data Architect, clients, ETL developers, other technical data warehouse team members and database administrators to design and develop high performing databases and maintain consistent data element definitions
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Created logical and physical data models using Erwin and reviewed these models with business team and data architecture team.
  • Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
  • Created SQL scripts to find data quality issues and to identify keys, data anomalies, and data validation issues.
  • Designed different type of STAR schemas for detailed data marts and plan data marts in the OLAP environment.
  • Produced and enforced data standards and maintain a repository of data architecture artifacts and procedures.
  • Provided architectures, patterns, tooling choices, and standards for master data and hierarchy lifecycle management.
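
A minimal SQL Server sketch of the star schema pattern described above (a fact table with foreign keys into conformed dimensions); all object names are hypothetical.

    -- Hypothetical dimension and fact tables for a star schema.
    CREATE TABLE dbo.DimCustomer (
        CustomerKey  INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
        CustomerId   VARCHAR(20) NOT NULL,           -- natural/business key
        CustomerName VARCHAR(100),
        Region       VARCHAR(50)
    );

    CREATE TABLE dbo.FactSales (
        SalesKey    BIGINT IDENTITY(1,1) PRIMARY KEY,
        CustomerKey INT NOT NULL
            REFERENCES dbo.DimCustomer (CustomerKey), -- FK forms one spoke of the star
        DateKey     INT NOT NULL,
        SalesAmount DECIMAL(12,2) NOT NULL
    );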

Environment: Erwin 9.6, Informatica v10, Power Pivot, SQL, Microsoft Azure, MS Excel, MS Visio, Rational Rose, SSAS, Pig, Hive, CSV files, XML files, Linux, AWK, Aginity, Teradata SQL Assistant, Oracle12c.

Confidential - Richmond, VA

Sr. Data Analyst

Responsibilities:

  • Analyzed the OLTP source systems and operational data store and researched the tables/entities required for the project.
  • Designed the measures, dimensions, and facts matrix document to ease subsequent design work.
  • Created data flowcharts and attribute mapping documents, and analyzed source semantics to retain and provide proper business names following FTB's stringent data standards.
  • Developed several scripts to gather all the required data from different databases to build the LAR file monthly.
  • Developed numerous reports to capture the transactional data for the business analysis.
  • Developed complex SQL queries to bring data together from various systems.
  • Organized and conducted cross-functional meetings to ensure linearity of the phase approach.
  • Collaborated with a team of Business Analysts to ascertain capture of all requirements.
  • Created multiple reports on the daily transactional data which involves millions of records.
  • Used Joins like Inner Joins, Outer joins while creating tables from multiple tables.
  • Created multiset, temporary, derived, and volatile tables in the Teradata database (see the Teradata SQL sketch after this list).
  • Implemented indexes, collected statistics, and defined constraints while creating tables.
  • Used ODBC connectivity from MS Excel to Teradata to retrieve data automatically from the Teradata database.
  • Designed and developed various ad hoc reports for different business teams based on their requirements (Teradata and Oracle SQL, MS Access, MS Excel).
  • Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and formatted the results into reports and kept logs.
  • Involved in writing complex SQL queries using correlated subqueries, joins, and recursive queries.
  • Delivered artifacts within timelines while maintaining high quality of deliverables.
  • Validated the data during UAT testing.
  • Performed source-to-target mapping.
  • Involved in Metadata management, where all the table specifications were listed and implemented the same in Ab Initio metadata hub as per data governance.
  • Developed Korn shell scripts to extract and process data from different sources in parallel, streamlining performance and improving execution time and resource efficiency.
  • Used Teradata utilities such as TPT (Teradata Parallel Transporter), FastLoad, and MultiLoad for handling various tasks.
  • Developed Logical data model using Erwin and created physical data models using forward engineering.
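
A minimal Teradata SQL sketch covering the volatile work tables, collected statistics, and recursive queries mentioned above; the database, table, and column names are hypothetical.

    -- Hypothetical volatile work table with a primary index, kept for the session.
    CREATE VOLATILE TABLE vt_acct_txn AS (
        SELECT acct_id, txn_dt, txn_amt
        FROM prod_db.transactions
        WHERE txn_dt >= DATE '2014-01-01'
    ) WITH DATA
    PRIMARY INDEX (acct_id)
    ON COMMIT PRESERVE ROWS;

    COLLECT STATISTICS ON vt_acct_txn COLUMN (acct_id);

    -- Recursive query of the kind used to walk a reporting hierarchy.
    WITH RECURSIVE org (emp_id, mgr_id, lvl) AS (
        SELECT emp_id, mgr_id, 0
        FROM prod_db.employees
        WHERE mgr_id IS NULL
        UNION ALL
        SELECT e.emp_id, e.mgr_id, o.lvl + 1
        FROM prod_db.employees e
        JOIN org o ON e.mgr_id = o.emp_id
    )
    SELECT * FROM org;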

Environment: Erwin 8.0, Teradata 13, TOAD, Oracle 10g/11g, MS SQL Server 2008, Teradata SQL Assistant, XML Files, Flat files

Confidential - Chicago, IL

Data Modeler/Data Analyst

Responsibilities:

  • Interacted with End user community to understand the business requirements and in identifying data sources.
  • Worked closely with business users, acting as IT's source of complete business knowledge for the projects.
  • Gathered business requirements through interviews, surveys, and observation of account managers and the UI (user interface) of the existing system.
  • Prepared high-level logical data models and BRDs (business requirements documents), with supporting documents containing the essential business elements and descriptions of the relationships between entities.
  • Played an active role as a member of Project team to provide business data requirements analysis services, producing logical and Physical data models.
  • Developed conceptual models based on requirement analysis.
  • Defined and processed facts and dimensions.
  • Designed the data marts using Ralph Kimball's dimensional data mart modeling methodologies with Erwin.
  • Designed different types of star schemas using Erwin, with various dimensions such as time, services, and customers, and corresponding fact tables (a dimensional query sketch follows this list).
  • Worked with end users to collect business data quality rules and worked with the development team to establish technical data quality rules.
  • Identified the Entities and relationships between the Entities to develop a logical model and later translated the model into physical model.
  • Created Source to Target mappings and Transformations. Mapped data between Source and Targets.
  • Applied normalization methods extensively.
  • Coordinated with DBAs and generated SQL code from the data models using Erwin.
  • Worked closely with the ETL SQL Server Integration Services (SSIS) Developers to explain the Data Transformation.
  • Supported UAT (User Acceptance Testing) by writing SQL queries.
  • Created SQL tables with referential integrity and developed SQL queries using SQL Server and Toad.
  • Performed extensive data analysis and data validation on Teradata.
  • Worked with the Reporting Analyst and Reporting Development Team to understand Reporting requirements.
  • Involved in generating ad hoc reports using Crystal Reports 9.
  • Designed and developed Oracle database tables, views, and indexes with proper privileges, and maintained the database by purging old data.
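
A minimal SQL sketch of the kind of dimensional query these star schemas support, joining a fact table to time and customer dimensions; all names are hypothetical.

    -- Hypothetical star-schema query: fact joined to its dimensions.
    SELECT t.calendar_month,
           c.customer_segment,
           SUM(f.service_amount) AS total_amount
    FROM fact_service f
    JOIN dim_time t ON f.time_key = t.time_key
    JOIN dim_customer c ON f.customer_key = c.customer_key
    WHERE t.calendar_year = 2012
    GROUP BY t.calendar_month, c.customer_segment
    ORDER BY t.calendar_month;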

Environment: Microsoft Windows Vista/Unix, Erwin 8.x, Oracle 11g, SQL, MS Excel, SSIS, Teradata, DB2, TOAD from Quest Software.

Confidential

Data Analyst/Data Modeler

Responsibilities:

  • Designed ER diagrams, logical model and physical database for Oracle and Teradata as per business requirements using Erwin.
  • Performed data analysis, data profiling, and data validation using complex SQL on various source systems, including Oracle and Teradata, to ensure accuracy of the data between the warehouse and source systems (see the reconciliation SQL sketch after this list).
  • Gathered business requirements through interviews, survey with users and business analysts.
  • Prepared functional requirements to define the system rules by creating Use Cases, Interface design and data dictionaries.
  • Analyzed various resource files being used to generate the report and documented data mapping and data source interface documents to assist the development team.
  • Used SQL for querying and analysis of various source tables with applied conditions, writing joins and subqueries.
  • Used SQL Server Reporting Services to schedule reports for generation at predetermined times, and built and maintained a robust data warehousing system for the organization.
  • Involved in business process modeling using UML through Rational Rose.
  • Implemented Cognos Framework Manager models, cubes, and reports to help business users identify opportunities for improvement in operations and processes.
  • Reverse Engineered DB2 databases and then forward engineered them to SQL Server 2000 using Erwin.
  • Developed the logical data model using Erwin and created physical data models using forward engineering to generate DDL scripts and define indexing strategies.
  • Denormalized the database to fit the star schema of the data warehouse.
  • Reviewed the existing data model and documented suspect design elements affecting system performance.
  • Partitioned SSAS cubes to improve their performance.
  • Identified potential customers from the given database and counseled the organization on ways of converting them into loyal customers as part of CRM.
  • Worked with Flat Files (Pipe Delimited) sources and implemented error handling routines.
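
A minimal SQL sketch of the source-to-warehouse reconciliation checks described above; the schema and table names (src.orders, dw.fact_orders) are hypothetical.

    -- Hypothetical validation query: compare row counts and totals on both sides.
    SELECT 'source' AS side, COUNT(*) AS row_cnt, SUM(order_amount) AS total_amt
    FROM src.orders
    UNION ALL
    SELECT 'warehouse' AS side, COUNT(*) AS row_cnt, SUM(order_amount) AS total_amt
    FROM dw.fact_orders;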

Environment: DB2, SQL Server 2000, Informatica Power Center, Erwin, Microsoft Visio, Rational Requisite Pro, Rational Rose, Windows 2003 Server.
