
Data Architect/Data Analyst Resume


MN

SUMMARY

  • Over 8 years of IT experience in Client-Server architecture, Data Modeling, Data Analysis, and Business applications across all phases of IT projects (analysis, design, development, implementation, and testing) of Data Warehousing using Erwin, Informatica (ETL), SQL, PL/SQL, and Oracle in OLAP and OLTP environments.
  • Implementation of Database Management Systems for Financial, Retail, Telecom, Healthcare & Insurance clients.
  • Involved in design of Dimension Models, Star Schemas and Snowflake schemas using Erwin.
  • Excellent knowledge of Bill Inmon and Ralph Kimball methodologies for designing database architecture, covering both logical design and physical implementation.
  • Experienced in business requirements confirmation, data analysis, data modeling, and logical and physical database design and implementation.
  • Experienced in Normalization and Denormalization processes, and in Logical and Physical data modeling techniques.
  • Proven knowledge in capturing data lineage, table and column data definitions, valid values, and other necessary information in the data models.
  • Working knowledge of Hadoop (HDFS) along with MapReduce.
  • Strong skills in development of mappings, documentation, implementation and enhancement of ETL strategies.
  • Experience in designing and developing complex mappings applying various transformations such as Source Qualifier, Expressions, Sorter, Router, Filter, Aggregator, Joiner, Connected and Unconnected Lookups, Sequence Generator, Rank, Update Strategy and Stored Procedure.
  • Experience with Informatica Power Exchange 9.x/8.x for change data capture (CDC) and mainframe data.
  • Extensively worked with Informatica Data Quality (IDQ) for data analysis, cleansing, and data matching.
  • Extensively worked on integration of Informatica MDM and Data Analyzer to show end-to-end lineage of data elements and provide BI answers.
  • Extensively worked on Informatica Information Lifecycle Management (ILM) to mask sensitive information and support Test Data Management (TDM) applications.
  • Extensive experience in Oracle development: table design, writing stored procedures, Database Triggers, Functions, PL/SQL packages, Oracle Cursor management, Exception Handling, MySQL, and T-SQL.
  • Extensive experience in performance tuning SQL and PL/SQL code using EXPLAIN PLAN, Trace, and TKPROF to significantly improve system response time and efficiency.
  • Programming skills using PL/SQL, MySQL, T-SQL and Shell/Python Scripts. Creation of scripts for collection of statistics, reorganization of tables and indexes and creation of indexes for enhancing performance.
  • Experience in performing and supporting Unit testing, System Integration testing (SIT), UAT and production support for issues raised by application users.
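The dimensional-modeling work above can be sketched in miniature. This is a hedged illustration only: the table and column names (a hypothetical sales mart) are invented for the example, and SQLite stands in for Oracle.

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to two dimension tables.
# All names and data are illustrative, not from any client project.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. YYYYMMDD
    full_date TEXT,
    year      INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    qty         INTEGER,
    amount      REAL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
conn.execute("INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97)")

# A typical dimensional query: aggregate the fact, slice by dimensions.
row = conn.execute("""
    SELECT d.year, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.name
""").fetchone()
print(row)  # (2024, 'Widget', 29.97)
```

A snowflake variant would further normalize the dimensions (e.g. splitting a product category out of `dim_product` into its own table).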

TECHNICAL SKILLS

CASE Tools: Erwin, Visio

ETL: Informatica (Power Center, Power Exchange, Developer Client, ILM, Metadata Manager, Data Analyzer, Analyst, Test Data Manager, IDQ), SSIS

Database: Oracle, Teradata, Netezza, DB2, UDB, MS SQL Server, MS Access

Tools: TOAD, Shell/Python/Ruby/Perl Scripting

BI Reporting Tools: Cognos, MicroStrategy, QlikView, OBIEE, Crystal Reports

Programming Language: PL/SQL, Java, C/C++

Operating Systems: UNIX, Windows

PROFESSIONAL EXPERIENCE

Confidential, MN

Data Architect/Data Analyst

Environment: Erwin, Informatica PowerCenter 9.5, Information Lifecycle Management (ILM), Metadata Manager, Data Analyzer, Oracle 11g, OBIEE, Windows Vista

Responsibilities:

  • Experience in Data Warehouse/Data Mart design, Data modeling (Logical/Physical Data models), System analysis, Database design, ETL design and development, SQL, PL/SQL programming.
  • Created mapping between the databases and identified the possibilities of incorporating the new business rules with the existing design.
  • Created prototype reporting models, specifications, diagrams/charts to provide direction to system programmers.
  • Prepared High-Level Design (HLD), Low-Level Design (LLD), Project Functional Specification, and Business Requirements Document (BRD).
  • Participated in Data Modeling, Data Analysis and user requirement gathering meetings.
  • Converted the Business rules into Logical/Physical model and then into Technical Specifications for ETL process.
  • Involved in all phases of SDLC from requirement, design, development, testing, pilot, training and rollout to the field users and support for production environment.
  • Identified the facts and dimensions, the grain of fact tables, and aggregate tables for Dimensional Models.
  • Developed, implemented, and maintained the Conceptual, Logical, and Physical data models.
  • Developed Snowflake Schemas by normalizing the dimension tables as appropriate.
  • Performed Dimensional Data Modeling to deliver multi-dimensional STAR schemas.
  • Applied Data Governance rules (Class word, Abbreviations, Data Domains, Data Definitions, & Metadata Info).
  • Involved in capturing data lineage, table and column data definitions, valid values, and other necessary information in the data models.
  • Worked with data compliance and data governance teams to maintain data models, metadata, and data dictionaries, and to define source fields and their definitions.
  • Redefined attributes and relationships and removed unwanted tables and columns as part of Data Analysis.
  • Developed ETLs for Data Extraction, Data Mapping and Data Conversion using Informatica PowerCenter.
  • Extensively used Informatica Metadata Manager and Data Analyzer for Data analysis and mining.
  • Good knowledge on creating and applying rules and policies using ILM workbench for data masking.
  • Worked on the Informatica PowerCenter 9.5 tool with XML as source and used Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets, and Reusable Transformations.
  • Wrote scripts for collection of statistics, reorganization of tables and indexes, and creation of indexes for enhancing performance and performance tuning for data access.
  • Extensively used EXPLAIN PLAN, Trace, and TKPROF in performance tuning SQL and PL/SQL code to significantly improve system response time and efficiency.
  • Used awk, sed, tar, and grep commands and SQL*Plus for connecting and loading data through UNIX, along with Shell/Python scripting.
  • Finalized the design document through design walkthroughs and independent review by the user. Prepared Technical Specifications for all the utilities, as per the company's standards.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Informatica mappings.
  • Working knowledge and brief experience with Hadoop (HDFS) and MapReduce.
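The ILM data-masking work above can be illustrated with a minimal sketch. This is not ILM itself (ILM applies vendor-defined masking rules through its workbench); the function name, salt, and SSN format below are hypothetical, chosen only to show the deterministic-masking idea that keeps referential integrity across tables.

```python
import hashlib

def mask_ssn(ssn: str, salt: str = "demo-salt") -> str:
    """Deterministically mask an SSN: the same input always yields the
    same masked value, so joins across masked tables still line up.
    Illustrative sketch only -- not a production masking rule."""
    digest = hashlib.sha256((salt + ssn).encode()).hexdigest()
    # Keep the ###-##-#### shape but replace digits with hash-derived ones.
    digits = [str(int(c, 16) % 10) for c in digest[:9]]
    return f"{''.join(digits[:3])}-{''.join(digits[3:5])}-{''.join(digits[5:9])}"

masked = mask_ssn("123-45-6789")
assert mask_ssn("123-45-6789") == masked   # deterministic: stable across tables
assert masked != "123-45-6789"             # original value is hidden
print(masked)
```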

Confidential, GA

Data Architect/Data Modeler

Environment: Erwin, Informatica PowerCenter 9.5, Oracle 11g, OBIEE, Windows Vista

Responsibilities:

  • Experience in Data Warehouse/Data Mart design, Data modeling (Logical/Physical Data models), System analysis, Database design, ETL design and development, SQL, PL/SQL programming.
  • Created mapping between the databases and identified the possibilities of incorporating the new business rules with the existing design.
  • Created prototype reporting models, specifications, diagrams (Logical/Physical models) and charts to provide direction to system programmers.
  • Prepared High-Level Design (HLD), Low-Level Design (LLD), Project Functional Specification, and Business Requirements Document (BRD).
  • Participated in Data Modeling, Data Analysis and user requirement gathering meetings.
  • Converted the Business rules into Logical/Physical model and then into Technical Specifications for ETL process.
  • Involved in all phases of SDLC from requirement, design, development, testing, pilot, training and rollout to the field users and support for production environment.
  • Involved in requirement gathering and database design and implementation of star-schema, snowflake schema/dimensional data warehouse using Erwin.
  • Played the role of a Data warehouse architect in the Business Intelligence Enterprise Architecture group for providing data architectural strategies.
  • Involved in Architecture and design of Data extraction, transformation, cleaning and loading.
  • Involved in Requirement gathering and source data analysis for the Data warehousing projects.
  • Involved in the Logical and Physical design and creation of the ODS and data marts.
  • Converted the business rules into Technical Specifications for the ETL process into the MySQL database.
  • Implemented mapping techniques for Type 1, Type 2 and Type 3 slowly changing dimensions.
  • Developed ETLs for Data Extraction, Data Mapping and Data Conversion using Informatica PowerCenter.
  • Involved in the development of Informatica mappings and mapplets, and tuned them for optimum performance, dependencies, and batch design.
  • Worked on the Informatica PowerCenter 9.0.1 tool: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets, and Reusable Transformations.
  • Wrote scripts for collection of statistics, reorganization of tables and indexes, and creation of indexes for enhancing performance for data access.
  • Involved in Testing and Test Plan Preparation, Process Improvement for the ETL developments.
  • Involved in Data analysis, Data Quality Assurance, created validation routines, unit test cases and test plan.
  • Finalized the design document through design walkthroughs and independent review by the user. Prepared Technical Specifications for all the utilities, as per the company's standards.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Informatica mappings.
  • Prepared Technical documents for all the modules developed by our team.
  • Working knowledge and brief experience with Hadoop (HDFS) and MapReduce.
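The slowly changing dimension mappings mentioned above can be sketched as follows. This is a hedged illustration of Type 2 handling only (expire the current row, insert a new version); the customer fields and dates are invented for the example.

```python
from datetime import date

# Hypothetical customer dimension rows; names and fields are illustrative.
dim = [{"cust_id": 1, "city": "Detroit", "eff_from": date(2020, 1, 1),
        "eff_to": None, "current": True}]

def apply_scd2(dim, cust_id, new_city, as_of):
    """Type 2 slowly changing dimension: expire the current row and
    append a new versioned row, preserving full history. (Type 1 would
    overwrite in place; Type 3 would keep a 'previous value' column.)"""
    for row in dim:
        if row["cust_id"] == cust_id and row["current"] and row["city"] != new_city:
            row["eff_to"] = as_of        # close out the old version
            row["current"] = False
            dim.append({"cust_id": cust_id, "city": new_city,
                        "eff_from": as_of, "eff_to": None, "current": True})
            break
    return dim

apply_scd2(dim, 1, "Chicago", date(2024, 6, 1))
current = [r for r in dim if r["current"]]
print(len(dim), current[0]["city"])  # 2 Chicago
```

In PowerCenter the same logic is typically built with a Lookup against the target dimension plus an Update Strategy transformation routing rows to insert or update.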

Confidential, GA

Data Architect/Data Analyst

Environment: Erwin, Informatica PowerCenter 9.0.1, SQL Server 2008, MySQL, Seaquest, Yotta (HP internal database and BI tools), Vertica (Big Data), Windows Vista

Responsibilities:

  • Experience in Data Warehouse/Data Mart design, Data modeling (Logical/Physical Data models), System analysis, Database design, ETL design and development, SQL, PL/SQL programming.
  • Created prototype reporting models, specifications, diagrams/charts to provide direction to system programmers.
  • Prepared High-Level Design (HLD), Low-Level Design (LLD), Project Functional Specification, and Business Requirements Document (BRD).
  • Participated in Data Modeling, Data analysis and user requirement gathering meetings.
  • Played the lead role in Designing and implementing Pricing Analytics, Best Customer and Finance Master Data Alignment Services (MDM) Data marts.
  • Involved in managing the delivery of data extracts from sources to the warehouse and downstream applications.
  • Involved in requirement gathering and database design and implementation of star-schema, snowflake schema/dimensional data warehouse using Erwin.
  • Played the role of a Data warehouse architect in the Business Intelligence Enterprise Architecture group for providing data architectural strategies.

Confidential, MA

Team Lead / ETL Developer

Environment: SQL Server 2005/2008, MS SQL Server Integration Services (SSIS), SSRS, Windows Vista

Responsibilities:

  • Experience in Data Warehouse/Data Mart design, System analysis, Database design, ETL design and development, SQL, PL/SQL programming.
  • Developed ETLs for Data Extraction, Data Mapping and Data Conversion using SSIS.
  • Experience in creating SSIS packages using ActiveX scripts and with error handling.
  • Experience in using SSIS tools like Import/Export Wizard, Package Installation, and SSIS Package Designer.
  • Experience in enhancing and deploying the SSIS Packages from development server to production server.
  • Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming and loading data into Data Warehouse.
  • Extensively used SSIS transformations such as Lookup, Fuzzy Lookup, Derived column, Data conversion, Aggregate, Conditional split, Term Extraction, Pivot Transformation, SQL task, Script task and Send Mail tasks.
  • Created reusable SSIS packages to transfer heterogeneous data from multi-formatted flat files, Excel, XML files, and DB sources (Oracle) into SQL Server 2005/2008.
  • Monitored and tuned MS SQL Server databases with tools such as Index Tuning Wizard, SQL Profiler, and Windows Performance Monitor for optimal performance.
  • Used Execution Plan, SQL Profiler and Database Engine Tuning Advisor to optimize queries and enhance the performance of databases.
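Three of the SSIS transformations named above (Lookup, Derived Column, Conditional Split) have simple procedural analogues. The sketch below is only an illustration of the data flow idea, not SSIS itself; the reference table, row fields, and threshold are hypothetical.

```python
# Rough Python analogue of three common SSIS transformations:
# Lookup (match against reference data), Derived Column (compute new
# fields), and Conditional Split (route rows by a predicate).
reference = {"US": "United States", "CA": "Canada"}  # Lookup reference table

rows = [{"id": 1, "country": "US", "amount": 120.0},
        {"id": 2, "country": "CA", "amount": 80.0},
        {"id": 3, "country": "XX", "amount": 50.0}]

matched, unmatched = [], []                  # Conditional Split outputs
for row in rows:
    name = reference.get(row["country"])     # Lookup
    if name is None:
        unmatched.append(row)                # route lookup failures aside
    else:
        row["country_name"] = name                     # Derived Column
        row["high_value"] = row["amount"] >= 100.0     # Derived Column
        matched.append(row)

print(len(matched), len(unmatched))  # 2 1
```

In a real package the unmatched output would typically feed an error table or a Send Mail task rather than being silently dropped.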

Confidential, MI

ETL Developer

Environment: Informatica PowerCenter 8.6 & 7.1, Informatica Power Exchange, IDQ, Cognos 8.3, SQL Server 2005, Netezza, Windows Vista

Responsibilities:

  • Involved in data analysis and development to understand attribute definitions for migration.
  • Prepared mapping specifications documents, unit testing documents for developed Informatica mappings.
  • Prepared mapping documents for loading the data from legacy system, flat files to staging area.
  • Worked with NZSQL and NZLOAD utilities to load data into Netezza from flat files.
  • Worked with Informatica Power Exchange for change data capture (CDC) and mainframe data.
  • Worked on the Informatica PowerCenter 8.5.1 tool: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets, and Reusable Transformations.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy transformation.
  • Involved in the design and development of mappings from legacy system to target Netezza database.
  • Used Informatica Data Quality (IDQ) for data profiling and cleansing.
  • Used the Netezza GROOM TABLE command to reclaim space from deleted and updated rows.
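The change data capture (CDC) work above can be illustrated with a simplified snapshot diff. Note the hedge: Power Exchange CDC reads changes from database logs in real time; the dictionary-diff below is only a sketch of the insert/update/delete classification idea, with invented keys and values.

```python
# Simplified CDC sketch: diff a source snapshot against the warehouse
# copy to classify rows as inserts, updates, or deletes.
# Keys and values are illustrative (key -> email, say).
source = {1: "alice@new.example", 2: "bob@new.example"}
target = {2: "bob@example.com", 3: "carol@example.com"}

inserts = {k: v for k, v in source.items() if k not in target}
updates = {k: v for k, v in source.items()
           if k in target and target[k] != v}
deletes = [k for k in target if k not in source]

print(sorted(inserts), sorted(updates), deletes)  # [1] [2] [3]
```

Log-based CDC avoids the full-table comparison this sketch implies, which is why it scales to mainframe and high-volume sources.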

Confidential, MI

ETL Developer

Environment: Informatica PowerCenter 8.6, Informatica Power Exchange, Informatica Data Quality (IDQ), Teradata, Cognos 8.3, SQL Server 2005, Windows Vista

Responsibilities:

  • Prepared mapping specifications documents, unit testing documents for developed Informatica mappings.
  • Prepared mapping documents for loading the data from legacy system, flat files to staging area.
  • Prepared mapping documents for loading the data from staging area to ODS.
  • Defined Target Load Order Plan for loading data into different Target Tables.
  • Worked on the Informatica PowerCenter 8.5.1 tool: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets, and Reusable Transformations.
  • Involved in the design and development of mappings from the legacy system to the target database.
  • Involved in performance tuning of the mappings by doing necessary modification to reduce the load time.
  • Used Informatica Data Quality (IDQ) for data profiling and cleansing.
  • Extensive experience in writing stored procedures, Database Triggers, Functions, PL/SQL packages, and T-SQL.
  • Optimized Query Performance, Session Performance and Reliability.
  • Used pmcmd and mailx commands to execute jobs from the UNIX platform.
  • Coordinated with the DBA team to resolve load-timing issues and other database issues hindering load performance.
  • Supported UAT and production testing by identifying and resolving bugs and data/load issues.
