Sr. Data Architect Resume
Reston, VA
SUMMARY:
- Over 11 years of experience as a Senior Data Architect/Data Modeler/Data Analyst in data analysis, data modeling, and data architecture, with an excellent understanding of data warehouse, database, data governance, and data mart design.
- Experienced in database creation and in the maintenance of physical data models for Oracle, Teradata, Netezza, DB2, and SQL Server databases.
- Experienced in Teradata SQL queries, indexes, and utilities such as MLoad, TPump, FastLoad, and FastExport.
- Experienced in data modeling with RDBMS concepts, logical and physical data modeling up to Third Normal Form (3NF), and multidimensional data modeling (star schema, snowflake schema, facts, and dimensions).
- Experienced in importing and exporting data with Sqoop between HDFS and relational database systems (RDBMS); an illustrative PySpark sketch of this transfer pattern follows this summary.
- Experienced with ER diagrams, dimensional data modeling, logical/physical design, star schema modeling, and snowflake modeling using tools such as Erwin and ER/Studio.
- Experience in designing, building, and implementing a complete Hadoop ecosystem comprising MapReduce, HDFS, Hive, Impala, Pig, Sqoop, Oozie, HBase, MongoDB, and Spark.
- Extensive experience in Normalization (1NF, 2NF, 3NF and BCNF) and De-normalization techniques for improved database performance in OLTP and Data Warehouse/Data Mart environments.
- Extensive experience in developing and driving the strategic direction of the SAP ERP system (SAP ECC) and the SAP Business Intelligence (SAP BI) system.
- Worked on the Hadoop ecosystem, Hive queries, MongoDB, Cassandra, Pig, and Apache Storm.
- Experienced with multiple data sources and targets, including Oracle, Netezza, DB2, SQL Server, XML, and flat files.
- Experience in working with Business Intelligence and Enterprise Data Warehouse (EDW) platforms, including SSAS, Pentaho, Cognos, OBIEE, QlikView, Greenplum, Amazon Redshift, and Azure Data Warehouse.
- Expertise in designing data warehouses using Ralph Kimball's and Bill Inmon's techniques.
- Experienced in Netezza Administration Activities like backup/restore, performance tuning, and Security configuration.
- Experienced in data analysis on Oracle, MS SQL Server, and MS Access, with extraction of data from database sources such as Oracle, MS SQL Server, DB2, and flat files into DataStage.
- Experienced in working with metadata from diverse sources, including relational databases (Oracle, Teradata, Netezza), XML, and flat files.
- Experienced in working with Teradata utilities such as FastLoad, MultiLoad, TPump, and FastExport, and with Teradata query submission and processing tools such as BTEQ and Teradata SQL Assistant (Queryman).
- Experienced in using SSIS programming features to code SSIS packages in multiple programming languages.
- Experienced with data conversion, data quality, data profiling, performance tuning, system testing, and implementing RDBMS features.
- Experienced in extracting data from various sources such as Oracle, mainframes, and flat files and loading it into the target Netezza database.
- Experienced in client-server application development using Oracle PL/SQL, SQL*Plus, SQL Developer, TOAD, and SQL*Loader.
- Well versed in advanced Excel concepts; worked with VLOOKUP, INDEX, MATCH, IF statements, pivot tables, and complex formulas.
- Expert in migration of data from Oracle, Sybase, and DB2 to SQL Server.
- Experienced in Extraction, Transformation and Loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and Metadata Manager), PowerExchange, and PowerConnect as ETL tools on Oracle, DB2, and SQL Server databases.
- Extensive experience with SSIS packages, SSRS reports, and SSAS cubes on production servers.
- Experienced in using Excel and MS Access to extract and analyze data based on business needs.
- Experienced in working with PostgreSQL databases, writing queries and stored procedures for normalization and de-normalization.
- Good knowledge of UI development, including HTML, CSS, JavaScript, jQuery, and AngularJS.
- Experienced in Data Analysis, Data Modeling, Development, Testing and Documentation of projects.
- Excellent understanding of, and working experience with, industry-standard methodologies such as the System Development Life Cycle (SDLC), the Rational Unified Process (RUP), Agile, and Waterfall.
- Experienced in ETL process using PL/SQL to populate the tables in OLTP and OLAP Data Warehouse Environment.
- Expertise in SQL Server Analysis Services (SSAS) to deliver Online Analytical Processing (OLAP) and data mining functionality for business intelligence applications.
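Illustrative sketch (not part of the original summary): a minimal PySpark job showing the RDBMS-to-HDFS transfer pattern referenced above. In practice this movement was done with Sqoop; the connection URL, credentials, table, and target path below are hypothetical placeholders, and a suitable JDBC driver is assumed to be on the Spark classpath.

    # Minimal sketch, assuming a reachable Oracle source and an HDFS target.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rdbms_to_hdfs").getOrCreate()

    orders = (spark.read.format("jdbc")
              .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")  # hypothetical source
              .option("dbtable", "SALES.ORDERS")                      # hypothetical table
              .option("user", "etl_user")
              .option("password", "****")
              .load())

    # Land the extract on HDFS as Parquet for downstream Hive/Spark use.
    orders.write.mode("overwrite").parquet("hdfs:///staging/sales/orders")

    spark.stop()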
TECHNICAL SKILLS:
Data Modeling Tools: Erwin r9.6/r9.5/r9.1/8.x, ER/Studio, Oracle Designer, PowerDesigner.
OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9.
Oracle: Oracle 12c/11g/10g/9i R2 database servers with RAC, ASM, Data Guard, Grid Control (Oracle Enterprise Manager), Oracle GoldenGate, SQL*Net, SQL*Loader, SQL*Plus, AWR, ASH, ADDM, Explain Plan.
ETL Tools: SSIS, DataStage, Informatica PowerCenter 9.7/9.6/9.5/9.1.
Programming Languages: Java, Base SAS and SAS/SQL, SQL, T-SQL, HTML, JavaScript, CSS, UNIX shell scripting, PL/SQL.
Database Tools: Microsoft SQL Server 2014/2012/2008/2005, Teradata, MS Access, PostgreSQL, Netezza, MySQL, Oracle.
Web technologies: HTML, DHTML, XML, JavaScript
Reporting Tools: Business Objects, Crystal Reports
Operating Systems: Microsoft Windows 9x/NT/2000/XP/Vista/7 and UNIX.
Tools & Software: TOAD 7.1/6.2, MS Office, BTEQ, Teradata SQL Assistant.
Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Flume.
Other Tools: TOAD, SQL*Plus, SQL*Loader, MS Project, MS Visio, MS Office; have also worked with C++, UNIX, and PL/SQL.
PROFESSIONAL EXPERIENCE:
Sr. Data Architect
Confidential, Reston, VA
Responsibilities:
- Provided data architecture support to enterprise data management efforts, such as the development of the enterprise data model and master and reference data, as well as support to projects, such as the development of physical data models, data warehouses, and data marts.
- Gathered business requirements, working closely with business users, project leaders and developers. Analyzed the business requirements and designed conceptual and logical data models.
- Led the strategy, architecture, and process improvements for data architecture and data management, balancing the long- and short-term needs of the business.
- Built relationships and trust with key stakeholders to support program delivery and adoption of enterprise architecture.
- Provided technical leadership and mentoring throughout the project life cycle, developing the vision, strategy, architecture, and overall design for the assigned domain and solutions.
- Assumed a leadership role across Data Warehouse group functions, including Business Analysis (defining the data transformation rules), Database Architecture (defining the logical and physical architecture), ETL (with DataStage as the platform), and Business Intelligence (reporting).
- Developed Spark code in Python for faster processing of Hive data and used Spark Streaming to divide streaming data into micro-batches as input to the Spark engine for batch processing (a minimal sketch of this pattern appears after this section).
- Worked on Dimensional and Relational Data Modeling using Star and Snowflake Schemas, OLTP/OLAP system, Fact and Dimension tables, Conceptual, Logical and Physical data modeling using Erwin r9.6.
- Played key role in defining all aspects of Data Governance - data architecture, data security, master data management, data archival & purging and metadata.
- Performed a PoC for a big data solution using Cloudera Hadoop for data loading and data querying.
- Wrote and optimized T-SQL and SQL queries in Oracle 12c, SQL Server 2014, DB2, Netezza, and Teradata.
- Worked on migrating the EDW to AWS using EMR and various other technologies.
- Created MDM, OLAP data architecture, analytical data marts, and cubes optimized for reporting.
- Involved in Logical modeling using the Dimensional Modeling techniques such as Star Schema and Snow Flake Schema.
- Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
- Developed Linux shell scripts using the NZSQL/NZLOAD utilities to load data from flat files into the Netezza database.
- Developed Python scripts (pandas, NumPy) for data ingestion, analysis, and cleaning.
- Worked with dimensional data modeling concepts such as star-join schema modeling, snowflake modeling, fact and dimension tables, and physical and logical data modeling.
- Collected large amounts of log data using Apache Flume and aggregated it in HDFS using Pig/Hive for further analysis.
- Developed logical and physical data models using data warehouse methodologies, including star and star-joined schemas, conformed dimensions, and early/late binding techniques, and designed and developed ETL applications using Informatica PowerCenter.
- Created data models for AWS Redshift and Hive from dimensional data models, worked on advanced SQL with columnar databases on AWS, and drove the technical design of AWS solutions by working with customers to understand their needs.
- Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL-like access on Hadoop data.
- Worked on Teradata 15 and its utilities; optimized queries in the Teradata database environment and used Teradata tools such as FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), and BTEQ.
- Performed data analysis, statistical analysis, generated reports, listings and graphs using SAS tools, SAS Integration Studio, SAS/Graph, SAS/SQL, SAS/Connect and SAS/Access.
- Imported and cleansed high-volume data from various sources such as Teradata 15, Oracle, SQL Server, and flat files.
- Loaded files from Oracle into Hive and HDFS, loaded data from the UNIX file system into HDFS, and validated OLAP report functionality and report data during unit and system testing.
- Created Informatica mappings to populate staging and data warehouse tables from various sources, including flat files, DB2, Netezza, and Oracle.
- Delivered the full life cycle of a data lake and data warehouse with big data technologies such as Spark, Hadoop, and Cassandra; developed enhancements to the MongoDB architecture to improve performance and scalability; and worked with MapReduce frameworks and associated tools (Pig, Sqoop, etc.).
- Developed data mapping, data profiling, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS systems.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
- Managed the definition and execution of data mapping, conversion, and reconciliation processes for data originating from numerous enterprise and SAP systems, leading into the design of the ongoing data governance organization.
- Converted existing reports and dashboards from Tableau and Qlikview to MicroStrategy.
- Performed data analysis and data profiling using complex SQL queries on various sources systems including Oracle, Teradata and Netezza.
Environment: Erwin r9.6, Oracle 12c, Teradata 15, Netezza, PL/SQL, T-SQL, MDM, BI (Tableau), DB2, SQL Server 2014, Informatica PowerCenter, SQL, Big Data, Hadoop, Hive queries, MicroStrategy, MapReduce, Pig, Cassandra, MongoDB, SAS, Spark, SSRS, SSIS, SSAS, AWS, S3, Redshift, EMR, Tableau, Excel, MS Access, SAP, etc.
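Illustrative sketch (not part of the original engagement record): a minimal Spark Streaming job in Python showing the micro-batch pattern referenced in this section, with incoming text records grouped into 10-second batches and handed to the regular batch engine. The host, port, and record layout are hypothetical placeholders.

    # Minimal sketch, assuming a delimited text feed on a TCP socket.
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    sc = SparkContext(appName="stream_to_batches")
    ssc = StreamingContext(sc, batchDuration=10)      # 10-second micro-batches

    lines = ssc.socketTextStream("eventhost", 9999)   # hypothetical feed
    events = lines.map(lambda line: line.split("|"))  # simple delimited parse
    events.count().pprint()                           # per-batch record counts

    ssc.start()
    ssc.awaitTermination()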
Sr. Data Architect
Confidential, Chicago IL
Responsibilities:
- Developed strategies for data acquisition, archive recovery, and database implementation, working in a data warehouse environment that included data design, database architecture, and metadata and repository creation.
- Participated in the design, development, and support of the corporate operation data store and enterprise data warehouse database environment.
- Documented the end-to-end process of working with Tableau Desktop, installing Tableau Server, and evaluating business requirements.
- Imported data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
- Implemented dimension model (logical and physical data modeling) in the existing architecture using ER/Studio.
- Updated Python scripts to match training data with the database stored in AWS CloudSearch so that each document could be assigned a response label for further classification.
- Worked with databases including Oracle, DB2, Teradata 14.1, Netezza, and SQL Server, as well as XML sources and big data/NoSQL stores such as MongoDB and Cassandra.
- Developed, managed, and validated existing data models, including logical and physical models of the data warehouse and source systems, utilizing a 3NF model.
- Gathered business requirements from customers and created data models for different branches using MS Access and ER/Studio.
- Extracted data from MySQL and AWS Redshift into HDFS using Sqoop.
- Designed source-to-target mappings, primarily from flat files, SQL Server, Oracle 11g, and Netezza, using Informatica PowerCenter 9.6.
- Worked on Normalization and De-Normalization techniques for both OLTP and OLAP systems.
- Designed logical, physical, and conceptual data model documents mapping source systems to the target data warehouse.
- Worked with Hadoop ecosystem covering HDFS, HBase, YARN and Map Reduce.
- Extensively used ER/Studio for developing data model using star schema methodologies.
- Worked on Data Modeling using Dimensional Data Modeling, Star Schema/Snow Flake schema, and Fact & Dimensional, Physical & Logical data modeling.
- Used external loaders such as MultiLoad, TPump, and FastLoad to load data into the Teradata database across analysis, development, testing, implementation, and deployment.
- Developed requirements and performed data collection, cleansing, transformation, and loading to populate facts and dimensions for the data warehouse.
- Created, managed, and modified logical and physical data models using a variety of data modeling philosophies and techniques, including Inmon and Kimball.
- Maintained data mapping documents, the business matrix, and other data design artifacts that define technical data specifications and transformation rules.
- Developed MapReduce jobs in Java for data cleaning and preprocessing, and developed Java UDFs as needed for use in Pig and Hive queries (an illustrative cleaning-step sketch follows this section).
- Managed the Master Data Governance queue, including assessment of downstream impacts, to avoid failures.
- Worked in the capacity of ETL developer (Oracle Data Integrator (ODI)/PL/SQL) to migrate data from different sources into the target Oracle data warehouse.
- Created and configured workflows, worklets, and sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
- Worked with high volume datasets from various sources like SQL Server 2012, Oracle, DB2, and Text Files.
- Created named sets and calculated members and designed scopes in SSAS, and worked with SSIS and SSRS.
- Worked on Teradata SQL queries, Teradata indexes, and utilities such as MLoad, TPump, FastLoad, and FastExport.
- Migrated SQL Server 2008 to SQL Server 2014 on Microsoft Windows Server 2003 and troubleshot high-availability scenarios involving clustering, database mirroring, log shipping, and replication.
- Wrote SQL scripts to test the mappings and developed a traceability matrix of business requirements mapped to test scripts, ensuring that any change control in requirements led to test case updates.
- Provided support to the automation team in architecting MySQL database environments for different projects.
- Developed Power BI reports and effective dashboards after gathering and translating end-user requirements, and designed sophisticated dashboard reports using Power Pivot, Power View, and Power BI.
- Performed extensive data validation by writing complex SQL queries, conducted back-end testing, and worked through data quality issues.
- Involved in Troubleshooting and quality control of data transformations and loading during migration from Oracle systems into Netezza EDW.
- Created SSIS packages to load data from different sources such as Excel, flat files, and DB2 into the SQL Server data warehouse and SQL Server and PL/SQL transactional databases.
Environment: ER/Studio, SSIS, SSRS, SAS, Netezza, Excel, MDM, PL/SQL, ETL, Tableau, Hadoop, Hive, Pig, MongoDB, Aginity, Teradata SQL Assistant, Cassandra, AWS, EMR, Power BI, MySQL, AWS Redshift, T-SQL, Cognos, DB2, Oracle 11g, SQL, Teradata 14.1, Informatica PowerCenter 9.6, etc.
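Illustrative sketch (not part of the original engagement record): the data-cleaning preprocessing referenced above was implemented as Java MapReduce jobs; the Python mapper below shows the same cleaning step in Hadoop Streaming style purely for illustration. The three-field comma-delimited record layout is a hypothetical example.

    #!/usr/bin/env python
    # Minimal sketch of a cleaning mapper: drop malformed records, standardize
    # text fields, and normalize numeric fields before downstream aggregation.
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split(",")
        if len(fields) != 3:                 # drop malformed records
            continue
        cust_id, name, amount = fields
        name = name.strip().upper()          # standardize the text field
        try:
            amount = "{:.2f}".format(float(amount))   # normalize the numeric field
        except ValueError:
            continue
        print("\t".join([cust_id, name, amount]))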
Sr. Data Modeler/Data Architect
Confidential
Responsibilities:
- Created conceptual, logical, and physical relational models for the integration and base layers, and created logical and physical dimensional models for the presentation and dimension layers of a dimensional data warehouse in Erwin 9.5.
- Involved in reviewing business requirements and analyzing data sources from Excel, Oracle, and SQL Server for the design, development, testing, and production rollover of reporting and analysis projects.
- Analyzed, designed, developed, implemented, and maintained ETL jobs using IBM InfoSphere DataStage and Netezza.
- Extensively worked in Client-Server application development using Oracle 10g, Teradata 14, SQL, PL/SQL, Oracle Import and Export Utilities.
- Coordinated with DB2 on database build and table normalizations and de-normalizations.
- Conducted brainstorming sessions with application developers and DBAs to discuss de-normalization, partitioning, and indexing schemes for the physical model.
- Involved in several facets of MDM implementations including Data Profiling, metadata acquisition and data migration.
- Extensively used SQL*Loader to load data from legacy systems into Oracle databases using control files, and used the Oracle external tables feature to read data from flat files into Oracle staging tables.
- Performed extensive data validation by writing complex SQL queries, conducted back-end testing, and worked through data quality issues.
- Used SSIS to create ETL packages to validate, extract, transform, and load data into data warehouse and data mart databases, and processed SSAS cubes to store data in OLAP databases.
- Applied a strong understanding of data modeling (relational, dimensional, star, and snowflake schemas), data analysis, and data warehousing implementations on Windows and UNIX.
- Extensively worked with Netezza database to implement data cleanup, performance tuning techniques.
- Designed automated reports through MySQL and Excel to reduce manual work.
- Created ETL packages using OLTP data sources (SQL Server 2008, Flat files, Excel source files, Oracle) and loaded the data into target tables by performing different kinds of transformations using SSIS.
- Migrated SQL server 2008 to SQL Server 2008 R2 in Microsoft Windows Server 2008 R2 Enterprise Edition.
- Developed reusable objects such as PL/SQL program units and libraries, database procedures, functions, and triggers for use by the team, satisfying the business rules.
- Developed advanced PL/SQL packages, procedures, triggers, functions, and collections to implement business logic.
- Automated reporting of profitability metrics for different customer groups using SAS/BASE, SAS/MACRO, SAS/STAT, and SAS/GRAPH.
- Performed data validation on the flat files that were generated in UNIX environment using UNIX commands as necessary.
- Created jobs and alerts to run SSIS packages and SSRS reports periodically, and automated activities such as database backups and sequential SSIS/SSRS package runs using SQL Server Agent jobs and Windows Scheduler.
- Created packages, procedures, functions, created tables, indexes, constraints, PL/SQL tables, sequences, synonyms and views.
- Worked on logical data models and identified source tables to build MicroStrategy schema objects, including attributes, facts, hierarchies, and relationships.
- Implemented Type 2 and Type 3 slowly changing dimensions to preserve the history of reference data changes (a minimal Type 2 sketch follows this section).
- Worked with NZLoad to load flat-file data into Netezza, and with DB2 and the architect to identify proper distribution keys for Netezza tables.
Environment: Erwin 9.5, Teradata 14, Oracle 10g, PL/SQL, MDM, Tableau, SQL Server 2008, ETL, Netezza, DB2, SSIS, SSRS, SAS, SPSS, DataStage, Informatica, SQL, T-SQL, UNIX, MySQL, Aginity, MicroStrategy, SQL Assistant, etc.
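Illustrative sketch (not part of the original engagement record): a minimal pandas example of the Type 2 slowly changing dimension handling referenced above, in which changed rows are end-dated and a new current version is appended. Column names, dates, and values are hypothetical; the production loads ran through SSIS and PL/SQL.

    # Minimal Type 2 SCD sketch on a tiny, hypothetical customer dimension.
    import pandas as pd

    dim = pd.DataFrame({"cust_id": [1], "city": ["Reston"],
                        "eff_date": ["2015-01-01"], "end_date": [None], "current": [True]})
    incoming = pd.DataFrame({"cust_id": [1], "city": ["Chicago"]})

    # Find current rows whose tracked attribute has changed.
    merged = dim[dim["current"]].merge(incoming, on="cust_id", suffixes=("", "_new"))
    changed = merged[merged["city"] != merged["city_new"]]

    # Expire the old versions...
    expire = dim["cust_id"].isin(changed["cust_id"]) & dim["current"]
    dim.loc[expire, ["end_date", "current"]] = ["2016-06-30", False]

    # ...and append the new current versions.
    new_rows = (changed[["cust_id", "city_new"]]
                .rename(columns={"city_new": "city"})
                .assign(eff_date="2016-07-01", end_date=None, current=True))
    dim = pd.concat([dim, new_rows], ignore_index=True)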
Sr. Data Modeler/Data Analyst
Confidential
Responsibilities:
- Worked with business users during requirements gathering and business analysis to prepare high-level logical data models and physical data models.
- Performed Reverse Engineering of the current application using ER/Studio and developed Logical and Physical Data Models for Central Model consolidation.
- Involved in integration of various relational and non-relational sources such as DB2, Teradata 13.1, Oracle 9i, SFDC, Netezza, SQL Server, COBOL, XML and Flat Files.
- Involved in normalization/de-normalization, normal forms, and database design methodology; used data modeling tools such as MS Visio and ER/Studio for the logical and physical design of databases.
- Worked on Key performance Indicators (KPIs), design of star schema and snowflake schema in Analysis Services (SSAS).
- Experienced working on DB2 database for ETL operations. Moved database objects from DB2 to SQL Server Database.
- Created mappings using pushdown optimization to achieve good performance in loading data into Netezza.
- Performed Data Modeling, Database Design, and Data Analysis with the extensive use of ER/Studio.
- Involved in performance tuning and unit testing for SQL and PL/SQL code.
- Used several Oracle-supplied packages such as UTL_FILE and DBMS_JOB.
- Developed scripts for loading data from various systems using UNIX shell, Oracle PL/SQL, and SQL*Loader.
- Implemented migration of SQL Server databases, automated factory capacity report generation with MySQL and MS Access, and extracted monthly marketing data from various sources (flat files, spreadsheets, etc.); a small report-automation sketch follows this section.
- Documented ER Diagrams, Logical and Physical models, business process diagrams and process flow diagrams.
- Created reports in Oracle Discoverer by importing PL/SQL functions on the Admin Layer, in order to meet the sophisticated client requests.
- Extensively used SQL, Transact SQL and PL/SQL to write stored procedures, functions, packages and triggers.
- Built fully functional dashboards prototypes without going through ETL process and entire schema modeling lifecycle using MicroStrategy Visual Insights.
- Created tables, views, sequences, indexes, constraints and generated SQL scripts for implementing physical data model.
- Developed data mapping documents between legacy, production, and user interface systems.
- Developed a star schema for the proposed central model and normalized the star schema into a snowflake schema.
- Deployed monthly updates to Power BI Desktop and Power BI Report Server applications, and deployed Power BI reports to both the report server and cloud platforms.
- Created Physical Data Model from the Logical Data Model using Compare and Merge Utility in ER/Studio and worked with the naming standards utility.
- Involved in implementing the land process of loading the customer data set into Informatica PowerCenter/MDM from various source systems.
- Worked with mapping parameters, variables, and parameter files, and designed the ETL to generate parameter files dynamically.
- Maintained existing MySQL environments, including large replication environments for data mining, and handled migration of mid-sized databases from DB2 to SQL Server 2005/2008.
- Performed tuning and code optimization using techniques such as dynamic SQL, dynamic cursors, SQL query tuning, and writing generic procedures, functions, and packages.
- Created and configured workflows, worklets, and sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
- Designed conceptual, logical, extended logical, and physical data models in Oracle database environments.
- Extensively worked on Shell scripts for running SSIS programs in batch mode on UNIX.
Environment: ER/Studio, Teradata 13.1, SSIS, SAS, Excel, T-SQL, SSRS, Tableau, SQL Server, Cognos, pivot tables, graphs, MDM, PL/SQL, ETL, DB2, Oracle 11g/10g, SQL, MySQL, Power BI, UNIX shell scripting, MicroStrategy, XML, SQL*Loader, SQL*Plus, Informatica PowerCenter, etc.
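Illustrative sketch (not part of the original engagement record): a small pandas script showing the kind of monthly report automation referenced above, combining flat-file marketing extracts into a summary. File names, columns, and the output location are hypothetical placeholders.

    # Minimal sketch: combine monthly flat-file extracts and summarize revenue.
    import glob
    import pandas as pd

    frames = [pd.read_csv(path, parse_dates=["order_date"])
              for path in glob.glob("extracts/marketing_*.csv")]
    data = pd.concat(frames, ignore_index=True)

    report = (data.assign(month=data["order_date"].dt.to_period("M"))
                  .groupby(["month", "region"], as_index=False)["revenue"].sum())

    report.to_csv("reports/monthly_capacity_report.csv", index=False)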
Sr. Data Analyst/Data Modeler
Confidential
Responsibilities:
- Created and maintained logical and physical models for the data mart and created partitions and indexes for the tables in the data mart.
- Performed data profiling and analysis, applied various data cleansing rules, designed data standards and architecture, and designed the relational models.
- Maintained metadata (data definitions of table structures) and version controlling for the data model.
- Developed SQL scripts for creating tables, sequences, triggers, views, and materialized views.
- Analyzed and studied the source system data models to understand concept tie-outs so that integration into the existing data warehouse was seamless and data redundancy was eliminated.
- Used Teradata OLAP functions such as RANK, ROW_NUMBER, QUALIFY, CSUM, and SAMPLE (an illustrative equivalent in pandas follows this section).
- Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
- Conducted performance analysis and created partitions, indexes and Aggregate tables.
- Utilized ER Studio forward/reverse engineering tools and target database schema conversion process.
- Developed SQL scripts for loading the aggregate tables and rollup dimensions.
- Performed data profiling and data analysis to identify data gaps and become familiar with new source system data.
- Created required Oracle tables, indexes, Triggers, views, materialized views, global temporary tables, synonyms.
- Worked on curves- and amounts-related procedures, converted the code to Oracle to send output to the front end (.NET Framework), and optimized the code by removing temporary tables.
- Performed complex data analysis on VLDB while migrating from Sybase to Oracle.
- Extensively worked on cursors, ref cursors, collections, hints, bulk collect, and PL/SQL tables to improve performance while working with millions of rows of data.
- Received data warehouse feed files daily and weekly and loaded them into Oracle tables using SQL*Loader.
- Analyzed existing source systems with the help of data profiling and the source system data models, creating individual data models for various domains/subject areas for the proposed data warehouse solution.
- Performed unit testing, system integrated testing for the aggregate tables.
- Performed data analysis on the target tables to make sure the data met business expectations.
- Used Normalization methods up to 3NF and De-normalization techniques for effective performance in OLTP systems.
- Developed SQL scripts for loading data from staging area to target tables.
- Proposed the EDW data design to centralize the data scattered across multiple datasets.
- Worked on the development of Data Warehouse, Business Intelligence architecture that involves data integration and the conversion of data from multiple sources and platforms.
- Worked on Teradata SQL queries, Teradata indexes, and utilities such as MLoad, TPump, FastLoad, and FastExport.
- Worked on SQL and SAS script mapping.
- Followed the Type 2 dimension methodology to design for and maintain history data.
- Used a metadata tool to import metadata from the repository, set up new job categories, and create new data elements.
- Researched the database space savings attained as each module was released and produced before-and-after numbers for the release.
- Performed the Data Mapping, Data design (Data Modeling) to integrate the data across the multiple databases in to EDW.
Environment: Oracle 11g/10g, SQL*Plus, ER/Studio, MS Visio, Source Offsite (SOS), Windows XP, QC Explorer, SharePoint Workspace, Teradata, SQL, PL/SQL, IBM DB2, Business Objects XI 3.5, COBOL, QuickData, UNIX shell scripting, Linux, Sybase, Perl, .NET Framework, C#, TOAD, SQL*Loader.
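Illustrative sketch (not part of the original engagement record): the windowing analysis referenced above was done with Teradata OLAP functions; the pandas fragment below shows an equivalent of the common QUALIFY ROW_NUMBER() = 1 pattern (keep the most recent row per account) purely for illustration, with hypothetical column names and values.

    # Minimal sketch: keep the latest balance per account, mimicking
    # ROW_NUMBER() OVER (PARTITION BY acct ORDER BY as_of DESC) with QUALIFY rn = 1.
    import pandas as pd

    bal = pd.DataFrame({"acct": [10, 10, 20],
                        "as_of": ["2011-01-31", "2011-02-28", "2011-02-28"],
                        "balance": [500.0, 525.0, 900.0]})
    bal["as_of"] = pd.to_datetime(bal["as_of"])

    bal["rn"] = (bal.sort_values("as_of", ascending=False)
                    .groupby("acct").cumcount() + 1)
    latest = bal[bal["rn"] == 1].drop(columns="rn")
    print(latest)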