
Data Analyst / Data Architect Resume

NYC

SUMMARY:

A highly skilled and experienced data engineer with demonstrated success in designing and implementing projects in data warehousing, business intelligence, data integration, enterprise dashboards, master data management, enterprise data quality, dimensional modeling, and big data.

EXPERIENCE HIGHLIGHTS:

  • More than 15 years of experience in data warehousing technology.
  • Expert at writing complex SQL queries to analyze, profile, and find patterns in data (a profiling sketch follows this list).
  • Complex SQL development in Hadoop (Hive SQL), Teradata, Oracle, and SQL Server 2012.
  • Over 5 years of data modeling experience, with heavy prior experience in BI tools and BI architecture.
  • Enterprise data quality for MDM and compliance projects; experienced architect in implementing data quality solutions.
  • ETL architecture and ETL development, including hands-on custom-developed ETL scripts.
  • Data analyst for BASEL II data integrity projects in the commercial banking stream.
  • Data analyst and designer for enterprise data quality initiatives.
  • Data modeling: logical and physical modeling, with experience in canonical data models.
  • BI architect experienced in both MOLAP and ROLAP tools and operational reporting tools, and in evaluating BI tools.
  • Data architect experienced in data integration and ETL architecture, with a clear understanding of data structures; designed a near-real-time ETL architecture for a client.
  • MDM consultant with particular strength in the Customer and Product domains; worked on different MDM products, with key expertise in data quality, entry-criteria rules, match-merge, and survivorship rules.
  • Big data: AWS, HDFS, Impala, Hive, Google BigQuery, Cassandra, Amazon S3, Cloudera, Acxiom Links & Reltio MDM.
  • Expertise in data acquisition, data migration, data conversion, and integration projects.
  • Worked on end-to-end implementation of enterprise data quality as part of data governance.
  • Highly experienced in migrating data from legacy systems (mainframes: VSAM and QSAM files, IMS DB/DC hierarchical databases, COBOL) to Oracle / SQL Server staging areas, and in developing ETL interfaces between legacy systems and Oracle staging.
  • Worked on analysis, profiling, and implementation (design, coding, and testing) of a BASEL II data quality monitoring and profiling tool for enterprise-wide data elements.
  • Highly experienced in data analysis and profiling using the Informatica data profiling tool (Informatica Data Explorer) and SQL queries; experienced in using the SSIS Data Profiling task and Data Profile Viewer.
  • Highly experienced in using Informatica PowerCenter and the Informatica Data Quality (IDQ) workbench to code business rules for enterprise data elements as part of a BASEL II data quality initiative.
  • Worked extensively on source-to-target mappings, data requirements templates, data certification templates, end-to-end data lineage, requirements specification documents, element discovery forms, data flow diagrams, and data models.
  • Worked on creating and modifying enterprise data element business / data rule specifications by profiling data through Informatica Data Explorer and SQL.
  • Data cleansing rules for MDM projects.
  • Worked on the Customer and Product domains in MDM.
  • Extensive analysis of web traffic and clickstream data.
  • Worked on legacy mainframe projects (COBOL, JCL, VSAM, DB2 & QSAM).
  • Current expertise in Microsoft ETL: SQL Server Data Tools (using Visual Studio).
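
As a sketch of the profiling SQL referenced above (not any client's actual code), the queries below compute a basic column profile and a value-pattern analysis; the table customer_stg and column tax_id are hypothetical, and the syntax is Oracle-flavored.

    -- Column profile: row counts, null rate, and cardinality
    -- (customer_stg / tax_id are hypothetical names for illustration)
    SELECT COUNT(*)                                    AS total_rows,
           COUNT(tax_id)                               AS non_null_rows,
           COUNT(DISTINCT tax_id)                      AS distinct_values,
           1 - COUNT(tax_id) / CAST(COUNT(*) AS FLOAT) AS null_rate
    FROM   customer_stg;

    -- Pattern analysis: mask every digit to '9' and rank the resulting formats,
    -- exposing inconsistent formats such as '999-99-9999' vs. '999999999'
    SELECT TRANSLATE(tax_id, '0123456789', '9999999999') AS value_pattern,
           COUNT(*)                                      AS occurrences
    FROM   customer_stg
    GROUP  BY TRANSLATE(tax_id, '0123456789', '9999999999')
    ORDER  BY occurrences DESC;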

CORE COMPETENCIES:

  • Data Warehousing
  • ETL and Data Quality
  • Informatica PowerCenter
  • Informatica Data Quality
  • Informatica Data Explorer
  • SQL Server 2008/2012 SSIS
  • SSIS Data Profiling task & Data Profile Viewer
  • MDM
  • IBM InfoSphere (1 year of experience; last used in 2012)
  • Talend (pilot project at Gap International)
  • OLAP-BI
  • MicroStrategy BI Suite (Architect, Desktop, Narrowcast Server, Administrator, Web, Security & Designer), MS OLAP Analysis Services (knowledge)
  • Business Objects
  • COGNOS BI Suite
  • SSRS, SSAS
  • COGNOS Report Net (Report Studio)
  • Actuate v6.00, ERD Pro (Report Designer Professional), Actuate Active Portal
  • Data Modeling (Logical & Physical Data Modeling)
  • CA Erwin
  • Embarcadero (ER Studio DA)
  • Technical & Data Architect for Data warehousing Projects
  • Extract, Transform, and Load (ETL) / Data Staging Design
  • Data Quality Reporting schema design and implementation

TECHNICAL SKILLS:

Databases: Oracle, DB2, MS SQL Server (including SQL Server 2008 Enterprise Edition), DB2/400, IMS, MS Access, VSAM, MySQL, Teradata, Hadoop

Programming Languages & Tools: Unix shell scripts, VB macro development (Excel), TOAD, MS Visio, BonaVista MicroCharts, XML, SQL, Oracle PL/SQL, Hive SQL

Domain Knowledge: Dodd-Frank banking regulatory compliance, BASEL II EDQ project implementation, auto finance, retail & commercial banking, retail inventory, online purchase e-business, sales & marketing, health care, global customer relationship management (GCRM, with a focus on a universal customer ID)

Legacy Systems and Programming Languages / Tools: Mainframes, CL/400, COBOL, RPG/400, SQL/400, MF-COBOL, TELON, VS COBOL II, TSO/ISPF, JCL

Change Control Tools: Peregrine Service Center, VSS

Big Data: AWS, HDFS, Impala, Hive, Google BigQuery, Cassandra, Amazon S3, Cloudera, Acxiom & Reltio MDM

PROFESSIONAL EXPERIENCE:

Confidential, NYC

Data Analyst / Data Architect

Environment: SCRUM, CRM, Cloud - AWS, HDFS, Impala, Hive, Google BigQuery, Cassandra, Amazon S3, Cloudera, Acxiom & Reltio MDM, Teradata, SQL Server, Demandware, Sitecore, legacy source systems, SFMC

Responsibilities:

  • Assess the data quality of the existing databases.
  • Data analysis: data discovery, data profiling, and data rules.
  • Creating end-to-end data lineage.
  • Enterprise data dictionary.
  • Design and build the data mart.
  • ETL load strategy for the data mart.
  • Worked on change data capture (CDC) and the implementation of real-time updates (a minimal merge sketch follows this list).
  • Translate the conceptual model into logical and physical models using Erwin, and generate the DDL scripts from the physical model.
  • Horizontal message architecture: capture data as XML for every instance and feed it into a super-flat structure, which is the basis for all downstream feeds.
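
A minimal sketch of the change data capture apply step mentioned above, assuming hypothetical tables cdc_customer_delta (the captured changes) and dm_customer (the mart target); Oracle MERGE syntax is shown for illustration.

    -- Apply captured deltas to the mart: update existing rows, insert new ones
    -- (all table and column names are illustrative assumptions)
    MERGE INTO dm_customer tgt
    USING cdc_customer_delta src
       ON (tgt.customer_id = src.customer_id)
    WHEN MATCHED THEN UPDATE SET
         tgt.email      = src.email,
         tgt.updated_at = src.change_ts
    WHEN NOT MATCHED THEN INSERT
         (customer_id, email, updated_at)
         VALUES (src.customer_id, src.email, src.change_ts);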

Confidential, New Jersey

Data Analyst & Database Developer

Environment: Oracle Database, SQL Server Database, TOAD, CA Erwin Data Modeler, MS Visio, XML, Informatica PowerCenter, SQL, VSS, Informatica IDQ, legacy systems (mainframes), IBM InfoSphere

Responsibilities:

  • Key inputs to the business requirements document (BRD).
  • Architect the solution
  • Data analysis: data discovery, data profiling, and data rules.
  • Creating end-to-end data lineage.
  • Understand the MDM solution purchased from IBM and define the key data elements that will feed into MDM.
  • Identify data cleansing rules, address validation rules, matching rules, and survivorship rules for creating the golden record of the customer; the goal of the MDM solution is to create that golden record (an illustrative survivorship query follows this list).
  • Integrate the output from MDM into all commercial and retail sources.
  • Data flow diagrams for Customer and Lender models from source to target.
  • Responsible for source data profiling & analysis
  • Source data integration for the staging, lender, and customer models.
  • Create the logical models by working closely with the SMEs and BAs.
  • Work on the physical data models and create the DB objects with the help of the DBA.
  • Spec out details for Customer and Lender data loads.
  • Develop data warehouse process models, including sourcing, loading, transformation, and extraction.
  • Enterprise data dictionary
  • UAT / BAT deployment; monitor data load performance; perform regression testing.
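
An illustrative survivorship query in the spirit of the rules above: it keeps one surviving row per match group, preferring the most trusted source and then the most recent update. The names customer_matched, match_group_id, source_rank, and last_update_ts are assumptions, not the project's actual schema.

    -- Rank candidate rows within each match group and keep the winner
    SELECT *
    FROM  (SELECT c.*,
                  ROW_NUMBER() OVER (
                      PARTITION BY c.match_group_id
                      ORDER BY c.source_rank ASC,      -- lower rank = more trusted source
                               c.last_update_ts DESC   -- tie-break on recency
                  ) AS survivor_rank
           FROM customer_matched c) ranked
    WHERE survivor_rank = 1;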

Confidential, New Jersey

Data Analysis & Data Modeling

Environment: Oracle 10g, MySQL, SQL Server 2000, Business Objects (Crystal), TOAD, XML, CA Erwin, Informatica PowerCenter, Sybase, advanced Excel reporting (including BonaVista charts)

Responsibilities:

  • Feasibility study on report metrics and columns
  • End-to-end data lineage
  • Data analysis & data profiling
  • Data cleansing rules
  • Design physical models
  • Design & build the data warehouse
  • High-level and detailed design documents
  • Sybase queries to analyze data and identify gaps for the advanced Excel reporting (BonaVista)

Confidential, California

Data Analyst, Data Modeler & Data Warehouse Designer

Environment: Oracle 10g, Oracle 9i, MS Visio, SQL Server Database, Informatica PowerCenter, TOAD, Informatica IDQ & IDE, COGNOS BI, UNIX, shell scripts

Responsibilities:

  • Understand and gather the BASEL II compliance requirements and make the client's information system (corporate data warehouse) BASEL II ready.
  • Gap assessment of the existing information systems (warehouse, data marts & staging areas) for BASEL II readiness.
  • Develop the approach and design for the RMS data migration to the data warehouse environment.
  • Profile the data and build end-to-end data lineage for key data elements and their attributes, and prepare documents such as the data requirements template (DRT) and source-to-target (S2T) mapping.
  • Source data profiling and analysis.
  • Work with the client to identify the data elements that will feed the BASEL risk calculator engine, scorecard, and default database engine.
  • Address the various data quality issues in the corporate data warehouse and provide remediation.
  • Incorporate data aggregation, validation, categorization, standardization, and matching functions to address data quality issues.
  • Coordinate with the client on defining the project roadmap and the projects related to BASEL II.
  • Architect a solution to build a data quality tool that monitors the key BASEL II data elements for the completeness, consistency, and validity of their data.
  • Develop various business rules with the help of data stewards, using the Informatica Data Explorer profiling tool and complex SQL queries.
  • Code these business and technical rules in Informatica PowerCenter and the Informatica Data Quality workbench to measure the DQ error % and collect enterprise data quality metrics; the erroneous data is populated into a dedicated reporting schema. The objective is to measure data quality for the enterprise-wide data elements that are key to BASEL II compliance.
  • Design the data quality reporting schema (a schema sketch follows this list).
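
A sketch of what the dedicated data quality reporting schema and the DQ error % rollup could look like; all names below are illustrative assumptions, not the client's actual design.

    -- One row per rule execution, holding tested vs. failed record counts
    CREATE TABLE dq_rule_result (
        rule_id        INTEGER      NOT NULL,
        rule_name      VARCHAR(100),
        run_date       DATE         NOT NULL,
        records_tested INTEGER      NOT NULL,
        records_failed INTEGER      NOT NULL
    );

    -- DQ error % per rule per run; NULLIF guards against division by zero
    SELECT rule_id,
           run_date,
           100.0 * records_failed / NULLIF(records_tested, 0) AS error_pct
    FROM   dq_rule_result
    ORDER  BY run_date, rule_id;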

Confidential, San Diego

Data Analyst and Business Intelligence Report Developer

Environment: Tools - SQL*Plus, Actuate ERD Pro, Actuate Server, Lotus Notes, Oracle 9i, COGNOS BI, COGNOS Report Net, COGNOS Metrics Manager

Responsibilities:

  • Gaining domain knowledge in the auto finance sector.
  • Understanding the data flow from ultimate source to targets; capturing both functional and software requirements.
  • Statement of work (SOW) and software requirements specification (SRS).
  • Capture the functional requirements: data aggregation, granularity, and metrics.
  • Define data and reporting standards
  • Developing Reports in Actuate
  • Developing Reports in COGNOS Report Net
  • COGNOS BI
  • Data Analysis
  • Client interaction and leading the projects.

Confidential, Austin

Data Analyst, Data Warehouse Developer and Business Intelligence Report Developer

Environment: SQL Server, SQL Query Analyzer, Enterprise Manager, DTS, MicroStrategy 7i

Responsibilities:

  • Performed extensive web traffic analysis.
  • Build the Snowflake model using DSS Architect
  • Designing Report Templates in DSS Agent
  • Scheduling the Reports Through DSS Scheduler
  • The goal was to find ways to increase conversion for all online stores of Compaq (Confidential); these findings were published as white papers for the key decision makers.
  • Initiated many projects involving analyses such as purchase path / funnel, conditional convergence, survival analysis, deep dives into customer behavior on web pages, traffic analysis, online campaign analysis, and market basket analysis, and published the results as white papers (a sample funnel query follows this list).
  • Data mining: preparing the data for predictive modeling of online customer behavior.
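
A sample funnel query of the kind used in the purchase-path analysis above, assuming a hypothetical clickstream table page_views with session_id and page_type columns.

    -- Count distinct sessions reaching each funnel step;
    -- step-to-step ratios give the conversion / drop-off rates
    SELECT page_type                  AS funnel_step,
           COUNT(DISTINCT session_id) AS sessions
    FROM   page_views
    WHERE  page_type IN ('product', 'cart', 'checkout', 'order_confirm')
    GROUP  BY page_type
    ORDER  BY sessions DESC;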

Confidential

Data Warehouse Consultant

Environment: MicroStrategy tools (Architect, Desktop, Narrowcast Server, Administrator, Security & Designer), MS Access, VB coding, SQL/400, DB2/400, MS SQL Server, VSAM, UNIX (Tru64), Windows NT. Software: SQL*Loader, MicroStrategy 7i, VS COBOL II, JCL, TSO/ISPF, PowerTerm, Oracle 8i, Telnet, DB2, IBM MVS, OLAP, SQL*Plus, Unix shell scripts, Oracle 7.x

Responsibilities:

  • System study; new version of the logical (LDM) and physical (PDM) data model design for the client; SQL scripts for lookup and fact table loads.
  • Build the Snowflake model using DSS Architect
  • Designing Report Templates in DSS Agent
  • Scheduling the Reports Through DSS Scheduler
  • Develop & Test Daily Management Reports using MSTR.
  • User interaction for identifying New End User Reports
  • Developed an application in Excel VB to automate report formatting and distribution.

Confidential

Data Warehouse Developer

Responsibilities:

  • Designing the migration cycle at each plant.
  • Records management system data migration and integration
  • Project Planning and Monitoring.
  • Design, Coding, Testing and Implementation.
  • Checking data integrity before and after migration.
  • Client interaction, capturing report requirements, and developing them through EIS.
  • End User Training and Education.
  • Data Analysis
  • BI Tool Evaluation for choosing the right tool
  • Estimates
  • BI Architecture
  • Security & privileges in MSTR
  • Build the Snowflake model using DSS Architect (a snowflake DDL sketch follows this list)
  • Designing Report Templates in DSS Agent
  • Scheduling the Reports Through DSS Scheduler
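
A minimal snowflake-schema sketch corresponding to the modeling work above: a fact table joined to a normalized dimension chain. All table and column names are illustrative, not the client's actual model.

    -- Outer dimension: normalized out of the store dimension
    CREATE TABLE dim_region (
        region_id   INTEGER PRIMARY KEY,
        region_name VARCHAR(50)
    );

    -- Inner dimension: references the outer one (the "snowflaking")
    CREATE TABLE dim_store (
        store_id   INTEGER PRIMARY KEY,
        store_name VARCHAR(100),
        region_id  INTEGER REFERENCES dim_region (region_id)
    );

    -- Fact table: grain of one row per store per day
    CREATE TABLE fact_sales (
        sale_date  DATE,
        store_id   INTEGER REFERENCES dim_store (store_id),
        units_sold INTEGER,
        revenue    DECIMAL(12,2)
    );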

Confidential

Systems Executive

Environment: VS COBOL II, TELON, DB2, VSAM, QSAM, ENDEVOR, FOCUS, FILE-AID, TSO/ISPF, ETL design

Responsibilities:

  • Development of COBOL programs for data extraction, cleansing, and loading into the data warehouse.
  • The module comprises 45 batch COBOL programs that handle data extraction and cleansing for data coming from different source systems and loading into the warehouse; the warehouse is built on DB2.
  • Preparing specs, test plans, programming, and unit testing.
