
Sr. Data Architect Resume


Atlanta, GA

SUMMARY:

  • Over 11 years of IT experience as a Senior Data Architect/Modeler/Analyst, experienced in Data Analysis, Data Modeling, Data Architecture, and designing, developing, and implementing data models for enterprise-level applications and systems.
  • Experienced in integrating various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML, and flat files into a Netezza database.
  • Experienced in designing Star Schema and Snowflake Schema for data warehouses using tools such as Erwin Data Modeler, Power Designer, and Embarcadero ER/Studio (a schema sketch follows this list).
  • Experienced in big data analysis and developing data models using Hive, Pig, MapReduce, and SQL, with strong data architecture skills for designing data-centric solutions.
  • Experienced in data modeling for Data Mart/Data Warehouse development, including conceptual, logical, and physical model design, developing Entity Relationship Diagrams (ERDs), and reverse/forward engineering ERDs with CA Erwin Data Modeler.
  • Extensive knowledge of big data, Hadoop, MapReduce, Hive, NoSQL databases, and other emerging technologies.
  • Experienced in Netezza tools and utilities: nzload, nzsql, NZPLSQL, SQL toolkits, analytical functions, etc.
  • Extensive experience in relational and dimensional data modeling, creating logical and physical database designs and ER diagrams using multiple data modeling tools such as Erwin and ER/Studio.
  • Very good experience with and knowledge of Amazon Web Services: AWS Redshift, AWS S3, and AWS EMR.
  • Experienced in importing and exporting data using Sqoop between HDFS and relational database systems/mainframes.
  • Excellent knowledge of the Ralph Kimball and Bill Inmon approaches to data warehousing.
  • Experienced in development and support of Oracle SQL, PL/SQL, and T-SQL queries.
  • Experienced in Logical Data Models (LDM) and Physical Data Models (PDM) using the Erwin data modeling tool.
  • Experienced in migrating data from Excel, flat files, and Oracle to MS SQL Server using SQL Server SSIS.
  • Experienced in process improvement, normalization/de-normalization, data extraction, data cleansing, and data manipulation.
  • Experienced in ETL design, development, and maintenance using Oracle SQL, PL/SQL, TOAD, and SQL*Loader on relational database management systems (RDBMS).
  • Experienced in designing and developing data models for OLTP databases, Operational Data Stores (ODS), data warehouses (OLAP), and federated databases to support client Enterprise Information Management strategies.
  • Working experience with the Kimball methodology and Data Vault modeling.
  • Experienced in extracting data from various sources such as Oracle databases, flat files, and CSV files and loading it into target warehouses.
  • Experienced in extracting, transforming, and loading data from heterogeneous data sources to SQL Server using SQL Server Integration Services (SSIS) packages.
  • Hands-on experience with tools such as R, SQL, SAS, and Tableau.
  • Good knowledge of developing Informatica Mappings, Mapplets, Sessions, Workflows, and Worklets for data loads from various sources such as Oracle, flat files, DB2, and SQL Server.
  • Excellent understanding of and working experience with industry-standard methodologies such as the System Development Life Cycle (SDLC), the Rational Unified Process (RUP), and Agile methodologies.
  • Experienced in Business Intelligence (SSIS, SSRS), data warehousing, and dashboards.
  • Expertise in source-to-target mapping in enterprise and corporate data warehouse environments.
  • Experienced in MDM (Master Data Management): removing duplicates, standardizing data, and eliminating incorrect data.
  • Experienced in extracting, transforming, and loading (ETL) data using SSIS/DTS packages.
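
As a minimal, generic ANSI SQL sketch of the star-schema pattern referenced above (all table and column names here, such as dim_date and fact_sales, are illustrative assumptions, not objects from any actual engagement):

    -- Hypothetical star schema: one fact table joined to conformed dimensions.
    CREATE TABLE dim_date (
        date_key      INTEGER     NOT NULL PRIMARY KEY,  -- surrogate key, e.g. 20240131
        calendar_date DATE        NOT NULL,
        month_name    VARCHAR(10) NOT NULL,
        year_num      SMALLINT    NOT NULL
    );

    CREATE TABLE dim_customer (
        customer_key  INTEGER     NOT NULL PRIMARY KEY,  -- surrogate key
        customer_id   VARCHAR(20) NOT NULL,              -- natural/business key
        customer_name VARCHAR(100),
        region        VARCHAR(50)
    );

    -- Fact table: additive measures plus foreign keys to the dimensions.
    CREATE TABLE fact_sales (
        date_key     INTEGER       NOT NULL REFERENCES dim_date (date_key),
        customer_key INTEGER       NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount DECIMAL(18,2) NOT NULL,
        units_sold   INTEGER       NOT NULL
    );

In a snowflake variant, dim_customer would itself be normalized (for example, region split out into its own table keyed by a region_key).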

TECHNICAL SKILLS:

Analysis and Modeling Tools: Erwin r9.6/r9.5/r9.1/r8.x, Sybase PowerDesigner, Oracle Designer, BPwin, ER/Studio, MS Access 2000, Star Schema/Snowflake Schema modeling, fact and dimension tables, pivot tables.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9.

Oracle: Oracle 12c/11g/10g/9i/8.x R2 database servers with RAC, ASM, Data Guard, Grid Control, Oracle GoldenGate, Oracle Enterprise Manager, SQL*Net, SQL*Loader, SQL*Plus, AWR, ASH, ADDM, Explain Plan.

ETL Tools: SSIS, Pentaho, Informatica PowerCenter 9.7/9.6/9.5/9.1, DataStage.

Programming Languages: Java, Base SAS, SSIS, SAS/SQL, SQL, T-SQL, HTML/XHTML (HTML 4.01/3.2), JavaScript, CSS3/CSS2/CSS1, UNIX shell scripting, PL/SQL.

Database Tools: Microsoft SQL Server 2014/2012/2008/2005, Teradata, MS Access, PostgreSQL, Netezza, Oracle.

Web technologies: Python, HTML, XHTML, DHTML, XML, JavaScript

Reporting Tools: Business Objects, SSRS, Tableau, Crystal Reports

Operating Systems: Microsoft Windows 9x/NT/2000/XP/Vista/7 and UNIX.

Tools & Software: TOAD, MS Office, BTEQ, Teradata 15/14.1/14/13.1/13, SQL Assistant.

Big Data: Hadoop, HDFS 2, Hive, Pig, HBase, Sqoop, Spark.

AWS: AWS Redshift, AWS S3 and AWS EMR.

Other Tools: TOAD, SQL*Plus, SQL*Loader, MS Project, MS Visio, MS Office, C++, UNIX, PL/SQL, etc.

PROFESSIONAL EXPERIENCE:

Sr. Data Architect

Confidential - Atlanta, GA

Responsibilities:

  • Interacted with Business Analysts, SMEs, and other Data Architects to understand business needs and functionality for various project solutions.
  • Estimated project development time, data volume, and space usage.
  • Proposed, designed, and supported ETL implementations using Teradata tools and technologies such as BTEQ, MultiLoad, FastLoad, FastExport, SQL Assistant, and Teradata Parallel Transporter (TPT); see the Teradata SQL sketch after this section.
  • Used Erwin 7.3 for data model changes.
  • Responsible for creating the HLD, AID, and ADD documents.
  • Involved in code migration from DEV to QA and PRD.
  • Involved in sustainment and Red Hat support activities.
  • Participated in DRB presentations.
  • Utilized various transformations in mappings, such as Joiner, Aggregator, Union, SQL, XML Parser, Expression, Lookup, Filter, Update Strategy, Stored Procedure, and Router.
  • Implemented standards for naming conventions, mapping documents, technical documents, and migration forms.
  • Involved in writing shell scripts to accumulate the MTD source files.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables, and Session Parameters.
  • Provided support for issues raised by the Operations team in daily/weekly/monthly jobs.
  • Worked on different projects in each area, following up with the business on requirements, analysis, design, development, testing, deployment, and post-production support.
  • Designed ETL solutions by creating complex mappings and loading scripts.
  • Extracted data from various source systems such as Oracle, SQL Server, and flat files as per requirements.
  • Tuned sources, targets, mappings, and SQL queries in transformations.
  • Scheduled jobs in Maestro (Tivoli).
  • Involved in audit designs to maintain quality data checks between sources and targets, as well as internal audits.
  • Worked as an Architect on various projects at AT&T under contract for the last 5+ years; involved in designing new extracts and enhancing existing extracts based on business needs.
  • Provided support for the organization's database architecture through database design, modeling, and implementation.
  • Provided logical data models and ETL source-to-target data mappings for new data warehouse projects, including designing enhancements to existing data warehouse data models and creating data marts.
  • 13+ years of experience in information technology.
  • Excellent understanding of data warehousing concepts.
  • Expert in ETL strategies using Informatica PowerCenter 8.x.
  • Extensive knowledge of UNIX shell scripting.

Environment: Informatica PowerCenter 8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Oracle 11g/10g/9i, Teradata TD14/TD13 (V nodes with 612 terabytes), Windows XP, Linux, SQL Server, Test Director 8.0, Erwin 7.3, Tivoli 8.2, Visio
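
Much of the BTEQ/TPT work described above wraps set-based Teradata SQL. A minimal, hedged sketch of a typical staging-to-target load with deduplication via Teradata's QUALIFY clause; stg_orders, dw_orders, and their columns are hypothetical, not actual client objects:

    -- Keep only the latest staged row per order_id, then load the target.
    INSERT INTO dw_orders (order_id, customer_id, order_amt, load_ts)
    SELECT order_id,
           customer_id,
           order_amt,
           CURRENT_TIMESTAMP
    FROM   stg_orders
    QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id
                               ORDER BY src_update_ts DESC) = 1;

In a BTEQ script, a statement like this would sit between .LOGON and .LOGOFF commands with error-code checks; the SQL itself is the portable part.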

Sr. Data Architect

Confidential, Dallas TX

Responsibilities:

  • Designed and built relational database models and defined data requirements to meet business requirements.
  • Developed strategies for data acquisition, archive recovery, and implementation of databases.
  • Responsible for developing and supporting a data model and architecture that supports and enables the overall strategy of expanded data deliverables, services, process optimization, and advanced business intelligence.
  • Delivered on complex and ambiguous business requirements while elevating the data architecture to the next level.
  • Analyzed the reverse-engineered Enterprise Originations (EO) physical data model to understand the relationships between existing tables, and cleansed unwanted tables and columns as part of data analysis responsibilities.
  • Managed and reviewed Hadoop log files.
  • Developed logical and physical data models using ER Studio and converted them into physical database designs.
  • Used SQL on new AWS databases such as Redshift and Relational Database Service, and worked with various RDBMSs such as Oracle 11g, SQL Server, DB2 UDB, Teradata 14.1, and Netezza.
  • Designed and developed SSIS packages to import and export data from MS Excel, SQL Server 2012, and flat files.
  • Involved in database development by creating Oracle PL/SQL functions, procedures, and collections.
  • Strong knowledge of data warehousing concepts: the Ralph Kimball and Bill Inmon methodologies, OLAP, OLTP, Star Schema, Snowflake Schema, fact tables, and dimension tables.
  • Used the Data Vault modeling method, which was adaptable to the needs of this project.
  • Analyzed web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration, and the most purchased product on the website; see the HiveQL sketch after this section.
  • Extracted data from MySQL and AWS Redshift into HDFS using Sqoop.
  • Involved in automating and scheduling Teradata SQL scripts in UNIX using Korn shell scripting.
  • Deployed and scheduled reports using SSRS to generate all daily, weekly, monthly, and quarterly reports, including current status.
  • Developed Ab Initio graphs to fetch data from Oracle, Teradata, flat files, and mainframe files.
  • Involved in normalization and de-normalization of OLAP and OLTP systems, including relational databases, tables, constraints (primary key, foreign key, unique, and check), and indexes.
  • Involved in pulling data from different sources such as Teradata, Oracle, and text files using SAS/ACCESS and SAS SQL procedures, and created SAS datasets.
  • Designed source-to-target mappings, primarily from flat files, SQL Server, Oracle, and Netezza, using DataStage.
  • Worked on SQL Data Warehouse on Azure, designing services to handle computational and data-intensive queries against the database.
  • Reverse engineered existing relational and Data Vault database systems, as no data models existed for them.
  • Worked with ETL developers in creating external batches to execute mappings and Mapplets using the Informatica workflow designer, to integrate Shire's data from varied sources such as Oracle, DB2, flat files, and SQL databases and load it into the landing tables of Informatica MDM Hub.
  • Responsible for full data loads from production to the AWS Redshift staging environment.
  • Responsible for creating Hive tables, loading data, and writing Hive queries.
  • Utilized ODBC connectivity to Teradata and MS Excel for automating reports and graphical representation of data for business and operational analysts.
  • Created/generated source-to-target documents for the onshore and offshore ETL teams from Erwin.
  • Extracted data from existing data sources, developing and executing departmental reports for performance and response purposes using Oracle SQL and MS Excel.
  • Performed data profiling of Data Vault hubs, links, and satellites using Erwin-generated SQL scripts.
  • Designed Physical Data Models (PDM) using the Erwin data modeling tool, PL/SQL, and T-SQL, and managed metadata for the data models.

Environment: Erwin r9.5/9.1, Metadata, Netezza, Oracle 11g, Teradata 14.1, T-SQL, SQL Server 2012, DB2, SSIS, R, Python, Azure, Hadoop, Informatica 9.5, AWS, AWS Redshift, AWS S3, AWS EMR, Spark, MapReduce, UNIX, HTML, Java, Aginity, MySQL, Hive, Pig, MDM, PL/SQL, SPSS, ETL, DataStage, etc.
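
A minimal HiveQL sketch of the web-log analysis described above (unique visitors per day); the external table layout, HDFS location, and column names are assumptions for illustration:

    -- Hypothetical external table over raw web logs landed in HDFS.
    CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
        visitor_id STRING,
        url        STRING,
        event_ts   STRING   -- e.g. '2016-03-01 10:15:22'
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/raw/web_logs';

    -- Unique visitors per day.
    SELECT to_date(event_ts)          AS visit_day,
           COUNT(DISTINCT visitor_id) AS unique_visitors
    FROM   web_logs
    GROUP BY to_date(event_ts);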

Sr. Data Modeler/Data Architect

Confidential - Dallas, TX

Responsibilities:

  • Involved in requirement gathering along with the business analysts group.
  • Gathered all the report prototypes from the business analysts belonging to different business units.
  • Gathered various requirement matrices from the business analysts.
  • Participated in Joint Application Development (JAD) sessions.
  • Conducted design discussions and meetings to arrive at the appropriate data model.
  • Designed the model to support the various reporting requirements.
  • Designed a logical data model using the database tool Erwin r7.0.
  • Used ER Studio to create logical and physical data models for an enterprise-wide OLAP system.
  • Developed monthly summary and downstream data marts from enterprise-wide databases in accordance with reporting requirements, with dimensions such as time, customers, services, and accounts.
  • Developed Star and Snowflake schema-based dimensional models to build the data warehouse.
  • Modeled the dimensions and facts using Erwin for the centralized data warehouse.
  • Identified and tracked slowly changing dimensions and determined the hierarchies in dimensions; see the SCD sketch after this section.
  • Actively participated in data mapping activities for the data warehouse.
  • Created summary tables using de-normalization techniques to improve complex join operations.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis pertaining to Loan products.
  • Participated in the tasks of data migration from legacy to new database system.
  • Worked on Metadata exchange among various proprietary systems using XML.
  • Conducted Design reviews with the business analysts, content developers and DBAs.
  • Designed and implemented a physical data model to handle marketing strategies and to satisfy reporting needs dynamically.
  • Organized User Acceptance Testing (UAT), conducted presentations and provided support for Business users to get familiarized with Loan products application.
  • Handled performance requirements for databases in OLTP and OLAP models.
  • Worked with the Implementation team to ensure a smooth transition from the design to the implementation phase.

Environment: Erwin 9.x, Teradata V14, Teradata SQL Assistant, Informatica PowerCenter, Oracle 11g, Netezza, SQL Server 2008, Mainframes, SQL, PL/SQL, XML, Hive, Hadoop, Pig, SPSS, SAS, Excel, Business Objects, Tableau, T-SQL, SSRS, SSIS.
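
The slowly changing dimensions noted above are commonly handled with the Type 2 expire-and-insert pattern. A hedged, generic SQL sketch, with dim_customer, stg_customer, and the tracked column (region) all as illustrative assumptions:

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE dim_customer
    SET    effective_end_date = CURRENT_DATE,
           is_current         = 'N'
    WHERE  is_current = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = dim_customer.customer_id
                     AND  s.region     <> dim_customer.region);

    -- Step 2: insert a new current version for customers with no current row
    -- (both brand-new customers and those just expired in step 1).
    INSERT INTO dim_customer
        (customer_id, customer_name, region,
         effective_start_date, effective_end_date, is_current)
    SELECT s.customer_id, s.customer_name, s.region,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id
          AND d.is_current  = 'Y'
    WHERE  d.customer_id IS NULL;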

Sr. Data modeler/Data Analyst

Confidential - Iowa City, IA

Responsibilities:

  • Worked on the integration of existing systems at the data warehouse and application systems level.
  • Extensively used SQL for data analysis and to understand and document data behavior.
  • Reverse engineered existing databases to understand the data flows and business flows of existing systems, and to integrate new requirements into the future enhanced and integrated system.
  • Designed the procedures for getting data from all systems into the data warehousing system.
  • Worked with ETL architects and developers to design performance-centric ETL mappings.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using Star and Snowflake schemas.
  • Extensively worked on documentation of data models, mappings, transformations, and scheduling jobs.
  • Worked extensively with Business Objects report developers in creating data marts and developing reports to cater to existing business needs.
  • Designed mapping documents and mapping templates for Informatica ETL developers.
  • Extensively used and created SAS macros for efficiency and accuracy.
  • Deployed naming standards to the data models at the enterprise level and followed company standards for project documentation.
  • Used the data integration tool Pentaho to design ETL jobs in the process of building data warehouses and data marts.
  • Designed ER diagrams and logical models (relationships, cardinality, attributes, and candidate keys) and converted them to physical data models, including capacity planning, object creation and aggregation strategies, partition strategies, and purging strategies per the new architecture.
  • Designed and developed strategies for data conversion and data cleansing.
  • Created data mappings, tech designs, and loading strategies for ETL to load newly created or existing tables.
  • Extensively used the Agile methodology as the organization's standard to implement the data models.
  • Created schema objects such as indexes, views, sequences, triggers, grants, roles, and snapshots.
  • Developed Star and Snowflake schema-based dimensional models to build the data warehouse.
  • Developed statistics and visual analyses for warranty data using MS Excel, MS Access, and Tableau.
  • Developed strategies and loading techniques for better loading and faster query performance; see the summary-table sketch after this section.

Environment: Netezza, Agile, Oracle 9i, Informatica 9.2, Teradata 13.1, R, SAS, T-SQL, SQL Server, DB2, SSIS, Erwin r9.1, Aginity, SSRS, MDM, DVO, PL/SQL, ETL, DataStage, etc.
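
As a rough illustration of the summary-table strategy referenced above (pre-aggregating detail rows so reports avoid repeated large scans and joins), a hedged generic SQL sketch; warranty_claims and its columns are assumptions:

    -- Hypothetical monthly summary table built from a detail table.
    CREATE TABLE warranty_claims_monthly AS
    SELECT product_line,
           dealer_region,
           EXTRACT(YEAR  FROM claim_date) AS claim_year,
           EXTRACT(MONTH FROM claim_date) AS claim_month,
           COUNT(*)                       AS claim_cnt,
           SUM(claim_amount)              AS claim_amt
    FROM   warranty_claims
    GROUP BY product_line,
             dealer_region,
             EXTRACT(YEAR  FROM claim_date),
             EXTRACT(MONTH FROM claim_date);

Exact CREATE TABLE ... AS syntax varies by platform; Teradata, for instance, requires CREATE TABLE ... AS (...) WITH DATA.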

Sr. Data modeler/Data Analyst

Confidential

Responsibilities:

  • Created and designed fact and dimension tables and conceptual, logical, and physical data models using Embarcadero ER Studio.
  • Transferred data from various source systems including MS Excel, MS Access, Oracle 10g, and flat files to SQL Server and Teradata 14 using SSIS/DTS, with features such as data conversion.
  • Worked as Sr. Consultant for Wholesale Loans (Misys Loan IQ) at M&T.
  • Worked on the Netezza Admin Console when issues could not be resolved at the session/workflow level.
  • Extensively used ER Studio for developing data models using the Star Schema and Snowflake Schema methodologies.
  • Designed and developed complex interfaces with external systems using Oracle PL/SQL.
  • Worked on the Enterprise Business Intelligence (BI) offering for the Loan IQ product, which contains standard reports packaged for Loan IQ customers.
  • Developed Linux shell scripts using the nzsql/nzload utilities to load data from flat files into the Netezza database; see the Netezza SQL sketch after this section.
  • Implemented a dimensional model for the Data Mart and was responsible for generating DDL scripts using ER Studio.
  • Designed the ODS and Data Vault, with expertise in Loans and all types of Cards.
  • Used normalization as well as de-normalization techniques to process records depending on the data record structure of the source and target tables.
  • Supported Wholesale Loans and Trade technology applications such as Global Trade Services, accounts payable/accounts receivable, Loan IQ, Loan IQ Business Objects, Leasepak, Promerit, etc.
  • Performed data mapping between source and target systems and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data.
  • Used Netezza SQL to maintain the MDM frameworks and methodologies in use at the company, and accessed the Netezza environment for implementation of ETL solutions.
  • Worked extensively with RDBMS systems such as Oracle and SQL Server; comfortable with SQL, PL/SQL, triggers, stored procedures, functions, sequences, views, etc.
  • Upgraded the server from MS SQL Server 2005 to MS SQL Server 2008; experienced in configuring and deploying SSIS packages and SSRS reports.
  • Involved in debugging and tuning PL/SQL code and tuning and optimizing queries for Oracle, Teradata, and DB2 databases.
  • Used SAS to develop custom desktop solutions for stress testing of the bank's commercial loan portfolio.
  • Developed ETL procedures for moving data from source to target systems.
  • Developed and delivered dynamic reporting solutions using SQL Server 2008 Reporting Services (SSRS).
  • Developed mapping spreadsheets for the ETL team with source-to-target data mappings, physical naming standards, data types, volumetrics, domain definitions, transformation rules, and corporate metadata definitions.
  • Used the Agile Scrum methodology through the different phases of the software development life cycle.
  • Archived and retired several applications using Informatica ILM and DVO to archive the data and backload it into the data warehouse.
  • Collaborated on the source-to-target data mapping document and the data quality assessments for the source data.
  • Worked on SSIS and DB2 packages and DB2 Import/Export for transferring data from heterogeneous databases (text-format data) to SQL Server.

Environment: ER Studio, Teradata 14, Data Modeler, Netezza Aginity, Oracle 10g, SSAS, T-SQL, Tableau, R, UNIX, HTML, Agile, SSRS, Informatica 9.2, Loan IQ, Java, DVO, MySQL Server, DB2, SSIS, MDM, PL/SQL, ETL, etc.
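
nzload itself is a command-line utility, but the same flat-file load can be expressed at the SQL level on Netezza with a transient external table. A hedged sketch under that assumption; the file path and the loan_txn table are hypothetical:

    -- Hypothetical Netezza flat-file load via a transient external table:
    -- the file is read in place and appended to the target in one statement.
    INSERT INTO loan_txn
    SELECT *
    FROM   EXTERNAL '/data/inbound/loan_txn.csv'
           SAMEAS loan_txn
    USING (DELIMITER ','  SKIPROWS 1  MAXERRORS 10);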

Data Analyst

Confidential

Responsibilities:

  • Involved in Regression, UAT and Integration testing
  • Designed, developed, and implemented two professionally finished systems for tracking IT requests and providing a data repository for reports; documented all system functionality.
  • Participated in testing of procedures and data, utilizing PL/SQL to ensure the integrity and quality of data in the data warehouse.
  • Performed metrics reporting, data mining, and trend analysis in a help desk environment using Access.
  • Gathered data from the help desk ticketing system and wrote ad-hoc reports, charts, and graphs for analysis.
  • Worked to ensure high levels of data consistency between diverse source systems, including flat files, XML, and SQL databases.
  • Developed and ran ad-hoc data queries against multiple database types to identify systems of record, data inconsistencies, and data quality issues.
  • Involved in understanding customer needs with regard to data, documenting requirements, developing complex SQL statements to extract the data, and packaging/encrypting data for delivery to customers.
  • Experienced in Tableau administration using Tableau admin commands.
  • Worked with project team representatives to ensure that logical and physical Data models were developed in line with corporate standards and guidelines.
  • Involved in defining the source to target Data mappings, business rules and Data definitions.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata; see the profiling sketch after this section.
  • Created Excel charts and pivot tables for the Ad-hoc Data pull.
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Defined Data requirements and elements used in XML transactions.
  • Analyzed and rectified data in source systems and Financial Data Warehouse databases.

Environment: PL/SQL, UAT, XML, SQL, Tableau, Oracle 9i, and pivot tables
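
A minimal example of the kind of profiling SQL mentioned above: completeness and cardinality checks for a single column. The customer_stage table and its email column are hypothetical:

    -- Row counts, null rates, and distinct counts for one source column.
    SELECT COUNT(*)                                  AS total_rows,
           COUNT(email)                              AS non_null_emails,
           COUNT(*) - COUNT(email)                   AS null_emails,
           COUNT(DISTINCT email)                     AS distinct_emails,
           ROUND(100.0 * COUNT(email) / COUNT(*), 2) AS pct_populated
    FROM   customer_stage;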
