
Data Architect (Lead/Administrator) Resume


Pleasanton, CA

EXPERIENCE SUMMARY:

  • 15+ years of IT experience in Data Warehousing, MDM, ETL, Business Intelligence and Data Modeling. Successfully delivered large-scale implementations for major clients in the High Technology, Healthcare and Financial Services sectors.
  • End-to-end delivery of executive dashboards to support business model transformation (cross-functional data; high-performance, aggregate-based architecture; Talend and Tableau)
  • Expertise in leading ETL and BI tools/architecture involving Talend, Tableau, IBM DataStage, Informatica, MySQL, Oracle and Teradata
  • Sound business knowledge of enterprise domains: sales, marketing, finance, product and master data
  • Strong experience in Data Warehousing projects and familiarity with all areas of the data warehouse: ETL, BI, reporting and data modeling (dimensional modeling)
  • Extensive Tableau experience in enterprise environments, covering administration, development and architecture
  • Expertise in handling large-volume data objects and optimizing ETL job performance to meet committed service levels
  • In-depth knowledge of Data Warehousing & Business Intelligence concepts with emphasis on ETL and life cycle development, including requirement analysis, design, development, testing and implementation
  • Strong OLAP, SQL, PL/SQL and business analysis skills. Excellent skills in troubleshooting business-critical issues and implementing quick resolutions
  • Proficient in gathering business requirements and creating technical specification documents
  • Proficient in creating PL/SQL stored procedures for ETL processes.
  • Experienced in creating optimized star schemas.
  • Expertise in database-level tuning activities - creating indexes and partitioning tables for performance (a sketch follows this list).
  • Analyzing, designing, developing and integrating MicroStrategy 9.x reporting applications targeted at both web and mobile platforms
  • Extensive experience in developing reports, dashboards & scorecard reports in the MicroStrategy Desktop and MicroStrategy Web interfaces
  • Hands-on UI development; performed data modeling, application development, technical product assistance and performance tuning to meet customer performance and functional requirements
  • Proficiency with MicroStrategy Dynamic Dashboards, Flash Widgets & Mapping, report development, mobile application development and project design: facts, attributes, hierarchies, aggregation, portal integration, etc.
  • Specify business systems (inputs, outputs, data, human and automated interfaces) to meet information processing objectives for MicroStrategy's software platform
  • As a Data Warehousing Consultant, involved in the analysis, design and development of data warehouses using Talend, Tableau, DataStage 8.0/7.x/6.0, QualityStage, Informatica, Cognos, ERwin 4.x/3.5.2, SQL Server, Oracle 11g, SQL, PL/SQL, VB, ASP, COM and MTS
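A minimal sketch of the PL/SQL-based ETL and partitioning work listed above, in Oracle SQL. All table, column and procedure names are illustrative, not taken from any client system:

    -- Range-partitioned fact table; partition bounds are examples only
    CREATE TABLE sales_fact (
        sale_id   NUMBER,
        sale_date DATE,
        amount    NUMBER(12,2)
    )
    PARTITION BY RANGE (sale_date) (
        PARTITION p2017 VALUES LESS THAN (DATE '2018-01-01'),
        PARTITION p2018 VALUES LESS THAN (DATE '2019-01-01'),
        PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );

    -- A local index is maintained per partition, which helps large loads
    CREATE INDEX sales_fact_dt_ix ON sales_fact (sale_date) LOCAL;

    -- Simple incremental load procedure from a staging table
    CREATE OR REPLACE PROCEDURE load_sales_fact (p_since IN DATE) AS
    BEGIN
        INSERT /*+ APPEND */ INTO sales_fact (sale_id, sale_date, amount)
        SELECT s.sale_id, s.sale_date, s.amount
          FROM sales_stg s
         WHERE s.sale_date >= p_since;
        COMMIT;
    END load_sales_fact;
    /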

TECHNICAL SKILLS:

Data Warehousing Tools: Talend 6.2, Tableau 10.5, IBM InfoSphere DataStage 8.1/7.5/7.1/5.x, Informatica (PowerMart & PowerCenter) 10.1/9.5/8.5/6.2/5.0, Pentaho, Cognos 7.3, Business Objects, Teradata

Functional Expertise: Master Data, Sales Forecasting, Sales Pipeline, Opportunity, Leads, Quoting, Order Management (Booking, Billing, Backlog, Shipment, Invoice), Oracle Financials, Hyperion, Install base, Marketing Intelligence, Customer 360, SFDC, Workday, Service Now

Engagement Model: Onshore, Offshore coordination

Operating Systems: MS-DOS, WINDOWS 95/98/2000, WINDOWS NT, UNIX

Programming Languages: C, C++, SQL, PL/SQL, UNIX SHELL SCRIPTING.

RDBMS: ORACLE, TERADATA, Netezza, SQL SERVER 2012, DB2

Database Admin: ORACLE 10g, 11g

Reporting Tools: Tableau 6, 7, 8.1 and 10.5, QlikView, MicroStrategy, OBIEE, Actuate, Crystal Reports

OLAP Tools: Tableau, COGNOS EP SERIES 7.3 (IMPROMPTU, POWERPLAY, TRANSFORMER, ACCESS MANAGER, ARCHITECT, COGNOS SCRIPTING), MicroStrategy, ESSBASE

Data Modeling Tools: ERWIN 3.5.2, Oracle Data Modeler

ETL Tools: Talend, IBM DataStage 8.1/7.5 (Information Server), QualityStage, Informatica

Oracle Utilities: SQL LOADER, IMPORT/EXPORT

Performance Tuning: Partitioned Architecture, Materialized Views, Indexing/Hints, Collecting Stats

PROFESSIONAL EXPERIENCE:

Confidential, Pleasanton-CA

Data Architect (Lead/Administrator)

Responsibilities:

  • Designed and developed the architecture for enterprise data warehouse.
  • Designed enterprise data warehouse sub-system using ERWIN modeling tool.
  • End-to-end delivery of executive dashboards to support business model transformation (cross-functional data; high-performance architecture with aggregate-based models; Tableau)
  • Sales analytics (predictive analysis of forecast, renewals, annual contract value (ACV), annual recurring revenue (ARR), leads, opportunity, account, PS and cases data)
  • Developed the overall scope and strategy for the foundational aspects of Opportunity, Quote, Sales, Marketing, Product and Finance
  • Analyzing, designing, developing and integrating MicroStrategy 9.x reporting applications targeted at both web and mobile platforms
  • Partnered effectively with business stakeholders in mapping business model changes, process enhancements and ensured seamless reporting capabilities.
  • Subject matter expert in the sales process - Leads, Opportunities, Quoting, Order Management and Accounts.
  • Working with SMEs and business users to collect business requirements and understand business processes; responsible for preparing designs, specifications, data models and mapping specifications.
  • Management and implementation of database models, data flow diagrams, database schemas, db scripts, DTD schemas, structures and data standards to support a robust data management infrastructure.
  • Specify business systems (inputs, outputs, data, human and automated interfaces) to meet information processing objectives for MicroStrategy's software platform
  • Successfully executed a customer 360 initiative covering global accounts in a record timeframe using creative architectural and technical concepts
  • Implement best practices and methodologies in data analysis, design and modeling
  • Working on Extraction, Transformation and Load (ETL) strategy, design and implementation using Talend
  • Extracting web service data in JSON format
  • Writing Java routines for SQL analytical logic that is difficult to express in the Talend ETL tool, e.g. looping constructs and LEAD/LAG functions (the plain SQL form is sketched after this list)
  • Extensively using Talend components such as tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tOracleInput, tOracleOutput, tFileList, tDelimited, etc.
  • Created Talend jobs to retrieve data from legacy sources and user data from flat files on a monthly and weekly basis.
  • Performed requirement analysis of large-scale business systems and translated requirements into Statement of Work, Business Requirement Documents, Use Cases, Functional Specifications, Project Proposals, Testing and Post Implementation Plans.
  • Responsible for gathering requirements and understanding the source systems to implement EDW/MDM/BI.
  • Responsible for implementing Tableau in distributed environments across organization.
  • Responsible for development of Reports and Dashboards using Tableau.
  • Responsible for tool evaluation for Data Integration and Data Visualization.
  • Responsible for Building Sales, Marketing and Finance reports and dashboards.
  • Created Informatica mappings to load data using transformations such as Source Qualifier, Aggregator, Expression, Router, Union, Joiner, connected and unconnected Lookups, Filter, Sequence Generator and Update Strategy.
  • Performed tuning of Informatica sessions by implementing partitioning at the Informatica level, increasing block size, data cache size and sequence buffer length with the help of DBAs, and using target-based commit intervals and SQL overrides.
  • Used Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor and Repository Manager) for downstream and upstream feeds. Developed mappings, mapplets & workflows.
  • Maintaining weekly/monthly/quarterly metrics for topics/projects
  • Experience working with Tableau Server to create data sources and users, install Tableau and support business users in using Tableau.
  • Responsible for creating Tableau models, architectures, frameworks and strategies based on business requirements.
  • Performed administrative and support activities across Tableau infrastructure including security administration, installation, and configuration of Tableau Server.
  • Defined best practices during implementation and integration
  • Responsible for documentation related to Tableau server architecture and report development.
  • Actively involved in the creation of users, groups, projects, workbooks and the appropriate permission sets for Tableau server logons.
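The LEAD/LAG logic mentioned above, shown in its plain SQL form (the behavior the Java routine reproduces outside the database); table and column names are illustrative:

    -- Compare each booking with its neighbors for the same account
    SELECT account_id,
           booking_date,
           amount,
           LAG(amount)  OVER (PARTITION BY account_id ORDER BY booking_date) AS prev_amount,
           LEAD(amount) OVER (PARTITION BY account_id ORDER BY booking_date) AS next_amount
      FROM bookings
     ORDER BY account_id, booking_date;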

Environment: Tableau 10.5, Informatica 10.1, Talend 6.5, SFDC, Workday, MySQL, SQL Server

Confidential, Sunnyvale-CA

ETL Lead/Developer

Responsibilities:

  • Senior ETL architect - successfully deployed large-scale transformational capabilities for the Sales function.
  • Collaborated successfully with cross-functional teams to source enterprise information from disparate source systems - SFDC, Oracle ERP R12, 11i, Siebel, Oracle DRM Hyperion, SAP to develop customer 360 views and Sales compensation models.
  • Successfully developed customer 360 account dashboard by aggregating data from 20 different source systems
  • Designed scalable architecture to support sales reporting by different hierarchies (Channel, Commission, Geography); a rollup sketch in SQL follows this list
  • Deployed successful fiscal-year transition projects that implement new compensation models, customer segmentation models and account coverage models.
  • SME responsible for the Sales Bookings subject area which is a critical business decision enabler.
  • Expert in managing data flow and processing of high-volume data objects in the range of 200M rows.
  • Expertise in designing end-to-end job flow frameworks and processes to ensure job completion within service levels; SLA management is critical in 24x7 reporting environments
  • Implemented Master/Slave switch architecture to support 24x7 availability
  • Provided ETL architecture guidance and recommendations surrounding DataStage environment. Assisted the project manager in the ETL project planning, resourcing, ETL Designs and developing conceptual designs
  • Responsible for delivery of multiple parallel projects
  • Designed and developed various DataStage jobs using version 8.5
  • Documented, developed and reviewed technical specifications and data mapping documents
  • ETL job design and development
  • Designed, developed and implemented test strategies for the ETL process
  • Converted DataStage jobs to Pentaho for the Sales modules.
  • Ensured successful delivery using onshore/offshore model
  • Mentored offshore resources to take on additional responsibilities to ensure scalable resource pool and business continuity
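One way the multi-hierarchy rollups above can be expressed in SQL, using GROUPING SETS so each hierarchy is aggregated in a single pass; table and column names are hypothetical, not the production design:

    SELECT sales_channel,
           geography,
           SUM(booking_amount) AS total_bookings
      FROM sales_bookings
     GROUP BY GROUPING SETS (
           (sales_channel),             -- channel rollup
           (geography),                 -- geography rollup
           (sales_channel, geography),  -- combined breakdown
           ()                           -- grand total
     );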

Environment: IBM Information Server (IBM WebSphere DataStage and QualityStage 8.5), Pentaho, SFDC, Oracle Financials, Oracle EBS R12 (Quoting, Order Management, Manufacturing), Siebel, UNIX, Oracle 10g/11g, SAP

Confidential, San Jose-CA

ETL Developer

Responsibilities:

  • Provided ETL architecture guidance and recommendations surrounding DataStage environment. Assisted the project manager in the ETL project planning, resourcing, ETL Designs and developing conceptual designs
  • Responsible for delivery of multiple parallel projects
  • Designed and developed various DataStage jobs using version 8.1
  • Developed advanced DataStage jobs to read data from JDBC sources (Java applications)
  • Documented, developed and reviewed technical specifications and data mapping documents
  • ETL job design and development
  • Designed, developed and implemented test strategies for the ETL process

Environment: IBM Information Server (IBM WebSphere DataStage and QualityStage 8.1), Informatica PowerCenter 8.5, C++, Java, Windows 2000, UNIX, Oracle 10g, DB2

Confidential, CA

ETL Architect/ETL Admin

Responsibilities:

  • Working with the Business/analysts to analyze and architect new processes for implementing Business Intelligence solutions
  • Documenting, developing and reviewing technical specifications and data mapping documents
  • Design, develop, and implement test strategies for the ETL process
  • Bug fixing and providing support to the Business Intelligence team and business users.
  • Performing data analysis and developing dimensional and E-R models.
  • Involved in Designing Fact and Dimensional Tables, Slowly Changing Dimensions.
  • Involved in Performance Fine Tuning of ETL programs. Wrote SQL stored procedures, distributed applications, triggers and packages.
  • Working with the infrastructure team to maintain and administer the ETL environment
  • Provide on-call support functions as needed.
  • Working with the business ops team and business analysts to gather actual requirements from the business and collaborating with them to write BRDs
  • Writing FSDs based on the BRDs for each Cisco quarter's requirements, subject to time and resource availability
  • Worked in all phases of PDLC from requirement, design, development, testing, support for production environment.
  • Creating Informatica Mappings to load data using transformations like Source Qualifier, Aggregator, Expression, Router, Union, Joiner, Connected and Unconnected lookups, Filters, sequence generator and Update Strategy.
  • Landing data from Teradata into the Oracle staging environment as part of daily and weekly incremental extraction jobs (a sketch follows this list)
  • Implemented analytical functions.
  • Identifying long-running jobs (Informatica jobs, Oracle procedures), working with the performance team and implementing the changes they recommend.
  • Performed tuning of Informatica sessions by implementing partitioning at the Informatica level, increasing block size, data cache size and sequence buffer length with the help of DBAs, and using target-based commit intervals and SQL overrides.
  • Modifying existing procedures and shell scripts based on new enhancements.
  • Generating weekly and monthly Anomaly and DDR reports, validating them and sending them to the business team.
  • Creating Uprocs and tasks and defining dependencies in Dollar Universe to schedule and run workflows.
  • Wrote and evaluated test cases for unit, integration and system testing; opened cases in QC and Remedy
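A sketch of the watermark-driven incremental pattern behind the daily/weekly extraction jobs above; the watermark table, job name and column names are hypothetical:

    -- Pull only rows changed since the last successful run, then upsert
    MERGE INTO orders_stg tgt
    USING (
        SELECT l.order_id, l.order_amt, l.updated_at
          FROM orders_landing l
         WHERE l.updated_at > (SELECT w.last_run_ts
                                 FROM etl_watermark w
                                WHERE w.job_name = 'ORDERS_DAILY')
    ) src
    ON (tgt.order_id = src.order_id)
    WHEN MATCHED THEN
        UPDATE SET tgt.order_amt  = src.order_amt,
                   tgt.updated_at = src.updated_at
    WHEN NOT MATCHED THEN
        INSERT (order_id, order_amt, updated_at)
        VALUES (src.order_id, src.order_amt, src.updated_at);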

Environment: Informatica 9.1.0, C++, Windows 2000, UNIX, Oracle 9i, Toad, Business Objects

Confidential, San Francisco, CA

Sr. Informatica Developer

Responsibilities:

  • Involved in analysis, requirements gathering, functional/technical specifications, development, deployment and testing.
  • Prepared LLDs based on the HLDs to meet the business requirements
  • Created Informatica Mappings to load data using transformations like Source Qualifier, Aggregator, Expression, Router, Union, Joiner, Connected and Unconnected lookups, Filters, sequence generator and Update Strategy.
  • Used parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities.
  • Extracted source data from IMS DB using PowerExchange.
  • Used the Debugger in debugging some critical mappings to check the data flow from instance to instance.
  • Developed mappings to pull information from different tables, using SQL overrides to join the tables instead of a Joiner transformation to improve performance (see the sketch after this list)
  • Created parameter files in UNIX to run the workflows.
  • Scheduled sessions to update the target data using Workflow Manager.
  • Developed all the mappings according to the design document and mapping specs provided and performed unit testing.
  • Reviewed and validated the ETL mappings and data samples loaded in the test environment for data validation.
  • Incorporated policy/rule/plan for sensitive elements to be masked using ILM (Information life cycle management) tool
  • Performed tuning of Informatica sessions by implementing database partitioning, increasing block size, data cache size and sequence buffer length with the help of DBAs, and using target-based commit intervals and SQL overrides.
  • Performed data validation, reconciliation and error handling in the load process.
  • Worked on code migration from Informatica 8.6.1 to Informatica 9.0.1
  • Performed unit testing to validate the data loads in different environments
  • Resolved defects raised by the QA team and updated them in Quality Center.
  • Provided support for daily and weekly batch loads
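An illustration of the SQL-override pattern above: the join is pushed into the Source Qualifier's SQL query so the source database performs it instead of an Informatica Joiner transformation. Table names and the $$LOAD_START_DATE mapping parameter are hypothetical:

    -- Source Qualifier SQL override: database-side join plus incremental filter
    SELECT o.order_id,
           o.order_date,
           o.order_amt,
           c.customer_name
      FROM orders o
      JOIN customers c
        ON c.customer_id = o.customer_id
     WHERE o.order_date >= TO_DATE('$$LOAD_START_DATE', 'YYYY-MM-DD');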

Environment: Informatica PowerCenter 9.0.1/8.6.1, PowerExchange 9.0, Windows 2000, UNIX, Oracle 9i, Postgres, Netezza

Confidential

ETL Lead/Admin

Responsibilities:

  • Prepare Technical Design Doc for RX Claim Subject Area
  • Participated in Logical Data modeling sessions
  • Design of the ETL solution framework
  • Installed DataStage 7.5.2 on UNIX server for Development, Test and Production Environments.
  • Installed/upgraded the DataStage EE server (version 7.x and above) and various DataStage PACKs (SAP, BW)
  • Troubleshooting the ETL development environment
  • Setting ETL development objectives, standards and guidelines, and templates for specifications and system design documents to meet development requirements

Environment: Ascential DataStage 7.5.2 (Server and Parallel Extender), QualityStage, C++, Windows 2000, UNIX, Oracle 9i, DB2, Toad

Confidential

Production Support Engineer

Responsibilities:

  • Re-architected the existing ETL jobs with restartability, error handling and failure notification using an ETL best-practices framework
  • Tuned ETL jobs for performance improvement
  • Implemented advanced survival rules to eliminate data quality issues.
  • Implemented advanced matching rules to eliminate data quality issues.
  • Wrote MS-DOS batch scripts to validate input files and kick off DataStage jobs using a parameter file instead of environment variables.
  • Wrote complex views for QualityStage and stored procedures for complex business rules (a view sketch follows this list).
  • Responsible for direct customer interface in support of the production environment
  • Moved the environment onto a defined framework, improving the scalability and maintainability of the system
  • Advised the customer on possible enhancements and rewrites of the ETL code to improve efficiency, scalability and maintainability
  • Created a sound foundation with the client in terms of relationship building, gaining confidence toward future projects
  • Worked on multiple project proposals for my employer, offering the client cost-effective solutions in an onshore-offshore model
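A minimal example of the kind of rule-encoding view referenced above, standardizing fields before they reach QualityStage matching; all names are hypothetical, and the functions are limited to what Oracle 9i supports:

    CREATE OR REPLACE VIEW v_customer_match_input AS
    SELECT c.customer_id,
           UPPER(TRIM(c.first_name)) AS first_name,
           UPPER(TRIM(c.last_name))  AS last_name,
           -- strip common punctuation so phone numbers compare cleanly
           REPLACE(REPLACE(REPLACE(c.phone, '-', ''), '(', ''), ')', '') AS phone_digits,
           NVL(c.postal_code, 'UNKNOWN') AS postal_code
      FROM customers c
     WHERE c.status = 'ACTIVE';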

Environment: Ascential DataStage 7.5.2 (Server and Parallel Extender), QualityStage, C++, Windows 2000, UNIX, Oracle 9i, Toad

Confidential

Tech Lead/ETL Architect

Responsibilities:

  • Led the development and implementation team
  • Demonstrated knowledge of ETL implementation process, from business requirements through logical modeling, physical database design, data sourcing and data transformation, data loading, SQL, end-user tools, database and SQL performance tuning
  • Developed critical and complex modules (Derive Service Group Code, Derive INBP, Derive CCT, Derive FQA, GL Posting and EOM-10).
  • Designed and implemented a solution to meet the 7 a.m. deadline, a critical project requirement.
  • Designed and developed common error management at the organization level using DataStage Parallel Extender.
  • Architected, designed and developed an organization-level FQA engine using a custom C++ stage (functionality not supported out of the box by the DataStage framework) to support FQA tagging for financial systems.
  • Implemented a restartability framework.
  • Tuned ETL jobs for better performance
  • Developed and implemented automation of loading FND LOOKUP tables for various source systems
  • Created Stored Procedures for better performance
  • Provided DataStage production implementation solutions
  • Worked with each source system Technical Leads and came up with solutions to avoid contention and performance issues
  • Involved in designing and implementing best practices, including restartability, recovery, parameter standardization and capacity planning
  • Developed DataStage After Job Routines to retrieve statistics of the process and insert to an Audit table defined in the target
  • Optimized DataStage jobs utilizing parallelism, partitioning and pipelining features of Parallel Extender
  • Wrote complex shell scripts for executing FTP and PL/SQL stored procedures with exception handling (a PL/SQL sketch follows this list)
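A sketch of the PL/SQL side of that pattern: a stored procedure with explicit exception handling, so the calling shell script can detect failure. Procedure, table and column names are hypothetical:

    CREATE OR REPLACE PROCEDURE post_gl_batch (p_batch_id IN NUMBER) AS
    BEGIN
        UPDATE gl_staging
           SET status = 'POSTED'
         WHERE batch_id = p_batch_id;

        IF SQL%ROWCOUNT = 0 THEN
            RAISE_APPLICATION_ERROR(-20001, 'No rows for batch ' || p_batch_id);
        END IF;

        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;  -- re-raise so the caller (e.g. SQL*Plus WHENEVER SQLERROR) exits non-zero
    END post_gl_batch;
    /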

Environment: Ascential Enterprise DataStage 7.5 (Server and Parallel Extender), C++, Windows 2000, UNIX, Oracle 9i, Toad

Confidential

ETL Engineer

Responsibilities:

  • Converting business rules into technical design documents for the ETL process populating the fact and dimension tables of the data warehouse
  • Responsible for designing and implementing the ETL framework
  • Creating ETL jobs to extract source data from flat files and load it into the staging area and the fact and dimension tables of the data warehouse.
  • Working with stages like ODBC, OCI, Hash-File, Sequential File, Sort, Aggregator
  • Wrote custom routines for the ETL framework
  • Tuned DataStage jobs for better performance by creating DataStage hash files for staging the data and lookups
  • Developed processes for extracting, cleansing, transforming, integrating and loading data into the data warehouse using DataStage Designer
  • Automated the daily ETL process.
  • Scheduled the ETL jobs daily, weekly and monthly based on the business requirements.

Environment: Ascential DataStage 7.0 (Server and PX), Windows 2000, UNIX, Oracle 9i, Toad

Confidential, TX

DW Architect / Onsite Lead

Responsibilities:

  • Led/mentored the onsite and offshore team
  • Prepared ETL design templates and assigned work to offshore
  • Converted business rules into technical specifications for the ETL process populating the fact and dimension tables of the data warehouse
  • Responsible for creating and testing ETL processes composed of multiple DataStage jobs using DataStage Job Sequencer and shell scripts.
  • Created ETL jobs to extract source data from flat files and load it into the staging area and the fact and dimension tables of the data warehouse.
  • Worked with stages like ODBC, OCI, Hash-File, Sequential File, Sort, Aggregator
  • Tuned DataStage jobs for better performance by creating DataStage hash files for staging the data and lookups
  • Developed processes for extracting, cleansing, transforming, integrating and loading data into the data warehouse using DataStage Designer
  • Tuned the performance of the existing ETL stored procedures, reducing load times from several hours to a few minutes (see the bulk-load sketch after this list).
  • Automated the daily ETL process.
  • Tuned the performance of ETL jobs by splitting the transformations, writing to files and loading into the database.
  • Scheduled the ETL jobs daily, weekly and monthly based on the business requirements.
  • Created views for use in Impromptu reports and stored procedures.
  • Created PL/SQL stored procedures to build Impromptu reports.
  • Created complex Impromptu reports using Power Prompt applications
  • Created complex external-rollup Cognos cubes using PowerPlay Transformer
  • Created security filters using Access Manager for Cognos to restrict report access based on the role of the user viewing the reports.
  • Extensively used the change management process to move code across environments.
  • Converted existing IWR reports to ReportNet
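A hedged illustration of the kind of stored-procedure change that cuts load times from hours to minutes: replacing a row-by-row cursor loop with BULK COLLECT / FORALL. Names are hypothetical, and the target table is assumed to have the same column layout as the staging table:

    DECLARE
        CURSOR c_src IS SELECT * FROM staging_orders;
        TYPE t_rows IS TABLE OF staging_orders%ROWTYPE;
        l_rows t_rows;
    BEGIN
        OPEN c_src;
        LOOP
            -- fetch in chunks instead of one row at a time
            FETCH c_src BULK COLLECT INTO l_rows LIMIT 10000;
            EXIT WHEN l_rows.COUNT = 0;

            -- one bulk bind instead of thousands of individual INSERTs
            FORALL i IN 1 .. l_rows.COUNT
                INSERT INTO orders_fact VALUES l_rows(i);

            COMMIT;
        END LOOP;
        CLOSE c_src;
    END;
    /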

Environment: Ascential DataStage, Cognos EP Series 7 (Impromptu, Transformer, PowerPlay, IWR, Upfront, Access Manager, Cognos Script), Windows 2000, UNIX, Oracle 9i, Toad

Confidential, AZ

DWH Developer

Responsibilities:

  • Involved in the study of existing operational systems, data modeling and analysis
  • Translated business requirements into data mart design.
  • Defined entity relations (ER diagrams) and designed the physical databases for OLTP and OLAP (data warehouse)
  • Used DataStage as the ETL tool for building the data warehouse.
  • Extensively used Ascential DataStage for creating DataStage jobs.
  • Developed jobs using DataStage Designer; validated, scheduled, ran and monitored the DataStage jobs.
  • Worked on programs for scheduling data loading and transformations using DataStage from flat files to Oracle.
  • Created PL/SQL stored procedures to extract data from source tables and load the target tables of the data mart.
  • Tuned the performance of the ETL stored procedures.
  • Created stored procedures to validate the data before loading it into the data marts (a validation sketch follows this list).
  • Created stored procedures to build reports in Impromptu with the stored procedure template.
  • Created the data model using ERwin.
  • Wrote UNIX scripts to invoke sequences and jobs.
  • Created catalogs, folders and reports with Cognos Impromptu.
  • Used Impromptu features such as sub-reports, drill-through and charts.
  • Created different reports using Impromptu.
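A minimal sketch of a pre-load validation procedure of the sort described above; table, column and procedure names are hypothetical:

    CREATE OR REPLACE PROCEDURE validate_stage_sales AS
        l_bad_rows NUMBER;
    BEGIN
        -- count rows with missing amounts or unknown product keys
        SELECT COUNT(*)
          INTO l_bad_rows
          FROM stage_sales s
         WHERE s.amount IS NULL
            OR NOT EXISTS (SELECT 1
                             FROM dim_product p
                            WHERE p.product_id = s.product_id);

        IF l_bad_rows > 0 THEN
            RAISE_APPLICATION_ERROR(-20002,
                l_bad_rows || ' staging rows failed validation');
        END IF;
    END validate_stage_sales;
    /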

Environment: Ascential DataStage 6.x, UNIX, Windows 2000/NT, Oracle 8i, DB2 UDB, Cognos Impromptu 5.0, Cognos Transformer and PowerPlay 6.5, IWR, Cognos Script, Actuate, ERwin 3.5.2
