
Team Lead / ETL Architect / Data Migration, Big Data Analytics, Consultant Resume


Houston, TX

SUMMARY:

  • 15+ years of Information Systems experience including Retail, Utilities, Insurance, Finance, Banking, Credit Reporting, Health Care, and State FAMIS systems. 10+ years of experience in business data warehouse ETL/BI environments using Informatica 9.x, 8.6, 7.x, ABAP, SAS 9, Oracle 8i/9i/10g/11g, PL/SQL, DB2/UDB, Platinum Toolset, COBOL, and C.
  • Experience in Data Extraction, Transformation and Load (ETL) using Informatica 9.x, 8.x, 7.x, and PowerExchange; knowledgeable in SAP BusinessObjects Data Services XI 3.2 and BODS 4.0
  • Experienced on mainframes (UNIX/Linux/MVS, COBOL/C, DB2/ORACLE, and CICS) as well as client/server systems in a heterogeneous network with multiple DBMS gateways (Netezza, DB2, ORACLE, INFORMIX, Sybase); built interfaces with ERP applications such as PeopleSoft and SAP/Oracle 10g DB. Also worked on Novell, IBM-PC OS/2 LAN, and Windows NT.
  • Ported and tested DB front-end tools developed in 'C' on a UNIX mainframe platform to SUNOS OSF/Motif window manager, DG-Aviion, HP-UX, Apollo/Domain, hpwm HP window manager, DEC-ULTRIX, SCO-UNIX, and NOVELL NETWARE, and built interfaces to ORACLE, INFORMIX, InterBase, etc.
  • Conducted training programs for users, trainees, and management professionals in SAS, ETL, RM COBOL, MS COBOL, UNIX & ‘C’, and ISAM
  • Developed a C/Oracle database interface for School Admission modules
  • Provided technical support to convert a Mainframe MVS/COBOL-based system into a UNIX ‘C’/Oracle DB system.
  • Trained in SAP BusinessObjects BODS 4.0, SAP BusinessObjects 4.1 (UDT, IDT, WEBI, and Crystal Reports), ABAP, the Big Data Analytics platform (1010data), and DMExpress DMX-h Hadoop Sort and ETL tools.

PROFESSIONAL EXPERIENCE:

Team Lead / ETL Architect / Data Migration, Big Data Analytics, Consultant

Confidential, Houston, TX

Responsibilities:

  • Provide expert knowledge and solutions on building a TenUp64/unixODBC-based ETL framework to push Netezza EDW data to the 1010data Big Data Analytics reporting platform.
  • Build automated scripting to perform cross-DB subject-area data validations across disparate databases such as the Netezza EDW and the 1010data Big Data Analytics platform using shell scripting and the TenUp64 and TenDo utilities (see the validation sketch after this list).
  • Perform 1010data Analytics ETL Application Maintenance for multiple development projects including:
  • Install, upgrade, configure, and maintain 1010data’s TenUp and TenDo utility packages and 1010data’s sql1010odbc driver software package
  • Set up and maintain 1010data-specific unixODBC-based SAP and SUS (legacy) DSNs (for database connectivity) to source Netezza, Oracle, and DB2/400 databases
  • Perform Linux admin activities needed to support 1010data applications, such as moving/copying/linking libraries, organizing user bin files, and managing 1010 application security files, including encrypting/decrypting files and libraries needed by the 1010data application
  • Perform TenUp code deployment and troubleshooting (debugging/tracing, with and without Linux debug utilities)
  • Sync 1010data application-specific software modules and code across AWS production and AWS failover servers
  • Browse/review core Linux system logs (/var/log/messages, etc.) to research untrapped TenUp error messages (OS kill messages, SIGPIPE) in the logs
  • Set up RSA key-based communication to push GPG (GNU Privacy Guard) encrypted single sign-on IDM security data to the 1010data platform for Customer Reporting (a transfer sketch follows this list)
  • Mentor team members on all activities of the project
  • Work with various teams/support groups, including offshore, in collaborative working sessions
  • Design and create/code mappings of legacy data from SUS, MVS, Netezza, and flat files to push data to the Big Data Analytics platform
  • Document and extract business rules from ETL jobs using Informatica 9.x, Netezza, and FTP'd mainframe flat files
  • Support, set up, and perform Linux admin activities on the TenUp Linux server
  • Set up Netezza DB connections by creating ODBC DSNs on the TenUp server
  • Install unixODBC and verify connectivity with isql (see the DSN sketch after this list)
  • Support 1010 team in testing the TenUp64 and TenDo utilities
  • Interact with other teams and EDW support group
  • Provide level of effort estimates
  • Develop project plans and work allocations by resource
  • Create Change Request Documents
  • Develop application ETL code, Shell scripts, SQL scripts and jobs
  • Build the daily batch load status reporting process
  • Implement monitoring via an audit log text file
  • Implement data validation processes using isql scripting.
  • Implement the TenUp log purge/archival process (a housekeeping sketch follows this list)
  • Create/raise tickets with the CSG support desk team as necessary
  • Create and run ad-hoc data history catch-up job scripts and SQL scripts as needed
  • Schedule and conduct internal team meetings as well as meetings with 1010data and other support teams
  • Perform Data Analysis activities
  • Perform data validation activities and create/fill TLRFs as necessary
  • Create and update internal 1010 issue trackers
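
A minimal sketch of the DSN setup and isql verification described above; the DSN name, driver path, host, database, and credentials are all hypothetical placeholders, and the actual Netezza ODBC driver location varies by install:

    # Append a Netezza DSN to the unixODBC config (all values are placeholders)
    cat >> /etc/odbc.ini <<'EOF'
    [NZ_EDW]
    Description = Netezza EDW source
    Driver      = /usr/local/nz/lib64/libnzodbc.so
    Servername  = edw-host.example.com
    Port        = 5480
    Database    = EDW_PROD
    EOF

    # Verify the DSN with unixODBC's isql utility
    echo "SELECT 1;" | isql -v NZ_EDW etl_user "$NZ_PASS"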
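
The cross-DB validation scripting could run roughly as below, assuming two working DSNs (NZ_EDW and TENTEN are hypothetical names) and comparing per-table row counts; the parse of isql's tabular output is deliberately crude:

    #!/bin/ksh
    # Compare row counts for each subject-area table across the two platforms;
    # the table list, DSN names, and credentials are illustrative.
    for t in CUSTOMER ORDERS SHIPMENTS; do
        # isql prints a small ASCII table; the data value sits on line 4
        nz=$(echo "SELECT COUNT(*) FROM $t;" | isql -b NZ_EDW etl_user "$NZ_PASS" | sed -n '4p' | tr -d '| ')
        tt=$(echo "SELECT COUNT(*) FROM $t;" | isql -b TENTEN etl_user "$TT_PASS" | sed -n '4p' | tr -d '| ')
        if [ "$nz" = "$tt" ]; then
            echo "$(date '+%Y-%m-%d %H:%M:%S') $t OK ($nz rows)"
        else
            echo "$(date '+%Y-%m-%d %H:%M:%S') $t MISMATCH (nz=$nz, 1010=$tt)"
        fi
    done >> /var/opt/etl/validate.log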
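
The GPG/RSA push could look roughly like this sketch; the recipient key, file paths, and gateway host are placeholders:

    #!/bin/ksh
    # Encrypt the single sign-on IDM extract for the 1010data recipient's public key,
    # then push it over SSH using RSA public-key authentication (no interactive login).
    EXTRACT=/data/idm/sso_users.dat
    gpg --batch --yes --recipient ops@1010data.example --encrypt "$EXTRACT"
    scp -i "$HOME/.ssh/id_rsa" "${EXTRACT}.gpg" upload@gw.1010data.example:/incoming/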
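
The daily status reporting and log purge/archival bullets above might be implemented along these lines; the audit log format, paths, retention windows, and mail address are assumptions:

    #!/bin/ksh
    # Summarize today's entries from the audit log (assumed format: "<date> <job> <status>")
    # and mail the daily batch load status report.
    AUDIT=/var/opt/etl/audit.log
    awk -v d="$(date +%Y-%m-%d)" '$1 == d { print $2, $3 }' "$AUDIT" |
        mailx -s "Daily batch load status $(date +%Y-%m-%d)" etl-team@example.com

    # Purge/archive TenUp logs: compress after 30 days, delete archives after 90.
    find /var/opt/etl/tenup_logs -name '*.log'    -mtime +30 -exec gzip {} \;
    find /var/opt/etl/tenup_logs -name '*.log.gz' -mtime +90 -exec rm -f {} \;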

ETL Architect/Lead, CustomerDB / Digitized Transactions, Contractor

Confidential, Cedar Rapids, IA

Responsibilities:

  • Worked with the program manager and project manager/architect to streamline tasks and their dependencies for building a Go Green plan.
  • Participated in data analysis and mapping exercises from Mainframe & AS/400 legacy applications and the ODS to the new Oracle-based single Customer Database.
  • Worked with other groups, including offshore, in collaborative working sessions
  • Designed and coded mappings to extract data from Mainframe PDS, DB2, etc., to the Oracle DB
  • Developed ETL jobs using Informatica 9.x, Oracle 11g, z/OS, DB2, and FTP'd mainframe flat files (see the FTP sketch below)
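
A sketch of the kind of batch FTP pull used for the mainframe flat files; host, dataset names, and credentials are placeholders, and ASCII mode relies on the z/OS FTP server's EBCDIC-to-ASCII translation:

    #!/bin/ksh
    # Batch-mode FTP pull of an MVS flat file (current GDG generation) to the landing area.
    ftp -n mvs-host.example.com <<EOF
    user etluser $FTP_PASS
    ascii
    get 'PROD.CUST.EXTRACT(0)' /data/landing/cust_extract.dat
    quit
    EOF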

ETL Data Consultant

Confidential, O’Fallon, MO

Responsibilities:

  • Worked with the program manager and project manager to streamline tasks and their dependencies for building the Integrated Data Repository to replace the current Mainframe-based DB2 ODS system used for Portfolio Viewer reporting.
  • Participated in data mapping exercises from legacy applications and the ODS to the new EXADATA (Oracle)-based Data Repository.
  • Worked with other groups to set up and install SyncSort ETL servers
  • Installed and configured the DMExpress DMX-h Hadoop Sort and ETL tools
  • Attended corporate in-house training conducted by the vendor SyncSort on the DMExpress DMX-h Hadoop Sort and ETL tools
  • Designed and coded mappings to extract data from the PV DB to Oracle Cloud
  • Developed ETL Jobs using Informatica and DMExpress

Sr. BI Consultant / Legacy CIS+ to SAP CR&B Informatica ETL Team Lead

Confidential, Boise, ID

Responsibilities:

  • Migrated transformed Legacy CIS+ Electric Utilities data to the SAP ISU CR&B system and the EDW.
  • Mapped Legacy CIS+ Electric Utilities data to the SAP ISU CR&B system and the Enterprise Data Warehouse.
  • Architected and designed the creation of SAP EMIGALL data load files from Legacy CIS+ electric utility system data.
  • Performed customer utility data profiling for various CIS+ data tables in the Oracle staging area (see the profiling sketch after this list).
  • Designed and coded ETL mappings to extract data from the Oracle staging area
  • Managed offshore Informatica data extraction resources to extract and create SAP EMIGALL-format flat files using Informatica PowerCenter 9.x and PowerExchange.
  • Mentored junior team members in writing SQL and in data analysis, verification, and error resolution
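
Profiling work like the bullet above can be scripted against the staging schema; a minimal sketch, with the table and column names purely hypothetical:

    #!/bin/ksh
    # Quick profile of a CIS+ staging table: row count, distinct keys, null rate.
    sqlplus -s "stg_user/$ORA_PASS@STGDB" <<EOF
    SET PAGESIZE 50 LINESIZE 120
    SELECT COUNT(*)                                          AS total_rows,
           COUNT(DISTINCT account_no)                        AS distinct_accounts,
           SUM(CASE WHEN meter_id IS NULL THEN 1 ELSE 0 END) AS null_meter_ids
    FROM   stg_cis_account;
    EXIT
    EOF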

Sr. Consultant / IDEA Data Warehouse BI Team

Confidential, Chicago, IL

Responsibilities:

  • Architected and designed the APOC Global Sales ETL processes to load data into the IDEA data warehouse.
  • Performed customer data profiling for various data sources using DQ tools.
  • Designed and coded ETL mappings to load the Customer ADD ODS datastore, feeding the Siperian MDM tool to cleanse and standardize the customer subject area.
  • Contributed to the MDM Siperian proof-of-concept project by designing and developing various ETL data extract processes.
  • Designed and developed JAPAN EP (Economic Profitability) project tables and data extract processes using PowerExchange connectors for DB2/400.
  • On the ADD/ODS project, mentored junior team members and QA testers in writing SQL and in data verification and analysis.
  • Designed and converted Latin America BPCS data extract processes into Latin America SAP load jobs and processes for data extraction and transformation.
  • Developed Informatica mappings to populate the new IDEA data warehouse business-layer core dimension and fact tables.
  • Created Korn shell scripts to extract and FTP data across the enterprise (mainframe to Linux/UNIX and Windows server systems); also developed scripts to be run by AUTOSYS.
  • Migrated and upgraded the Global Sales ETL business processes developed in Informatica PowerCenter 7.x to Informatica 8.1, 8.5, and 9.x.
  • Designed and developed daily and monthly ETL FACT loads for the BI Re-Architecture project on the Informatica 8.6 GRID implementation (see the workflow launch sketch after this list).
  • Created detailed documentation of design specs, test plans, and data interface and metadata documents for the Informatica loads to the Oracle 10g/11g-based Data Warehouse.
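
The FACT load kickoff referenced above can be wrapped in a shell script for AUTOSYS to invoke; a sketch using Informatica's pmcmd CLI, with the domain, service, folder, and workflow names as placeholders:

    #!/bin/ksh
    # Start the daily FACT load workflow and wait so AUTOSYS sees the real exit status.
    pmcmd startworkflow -sv IS_GS_PROD -d Domain_GS -u etl_user -p "$INFA_PASS" \
        -f GLOBAL_SALES -wait wf_daily_fact_load
    rc=$?
    [ $rc -ne 0 ] && echo "wf_daily_fact_load failed (rc=$rc)" >&2
    exit $rc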

Sr. Analyst Technical Lead / Sr. Consultant

Confidential, Chicago, IL

Responsibilities:

  • Created custom data mapping for new clients to the common CSIF formats.

Sr. Consultant / Data Warehouse BI Team

Confidential, Bowling Green, KY

Responsibilities:

  • Architected and designed SAS/Microsoft DDE processes to push data as well as generate corporate sales forecasting reports for both Domestic and International Sales.
  • Mentored junior team members in designing a star schema for the PKMS application in the corporate data warehouse.

Technical Advisor / Sr. Consultant

Confidential, Chicago, IL

Responsibilities:

  • Designed and modified SAS processes to push data via FTP to the EDW (Enterprise Data Warehouse).
  • Modified SAS beneficiary cross-reference processes to create a denormalized cross-reference for the CDW (Chicago Data Warehouse).
  • Developed and implemented contingency plans for the Informatica loads to the Chicago Data Warehouse.
  • Provided post-implementation support for ETL and data feeds into Zeus and the CDW, and for Informatica staging of data in the corporate EDW / Oracle 10g-based Data Warehouse environment.
  • Developed and implemented the Chicago Data Warehouse purge and archival process for claims data for BRT re-use.
  • Enhanced and modified the existing load processes to the Chicago Data Warehouse (CDW) and Zeus production environments.

Sr. Data Consultant

Confidential, Pasadena, CA

Responsibilities:

  • Enhanced CLINIC utilization processes by creating SAS datasets from mainframe sequential data extractions.
  • Performed Southern California Kaiser SAS clinic utilization reporting and analysis.
  • Enhanced and customized CLINIC utilization processes for analytical reporting.
  • Managed development and integration of SAS/DB2/Teradata-based ad-hoc reporting requests.
  • Streamlined production batch processes.
  • Modified production jobs, reducing processing time and speeding up data availability.
  • Created SAS data restore process shell scripts and batch jobs.

Sr. Subject Matter Expert - Consultant

Confidential

Responsibilities:

  • Enhanced and customized mainframe processes to extract data as flat files from the legacy database, i.e., the migration of customer data from the existing AT&T legacy system to Cingular’s Telegence system.

Sr. Consultant

Confidential

Responsibilities:

  • Worked on the Bank Card Portfolio Risk Reporting & Analytics MIS Project.
  • Assisted RR&A programmers in developing a SAS/Excel parallel version of the new Business Objects risk-MIS report graphs and tables for a risk-adjusted financial performance metric.
  • Developed new Consumer Bank Card Portfolio Management MIS Reports using the Data warehouse Profit Tables data.
  • Developed the process for creating BC RAM (Risk-Adjusted Margin) and RAROC (Risk-Adjusted Return on Capital) data tables and charts using SAS data extracts from the mainframe DB2 data warehouse.

Sr. Consultant/Project Lead

Confidential

Responsibilities:

  • Automated REO data extraction and reporting by revising the process to use the Corporate Data Warehouse (CDW) instead of the legacy IBM mainframe datasets.
  • Created enterprise-wide restatement-related SAS accounting reports and made them available to end users through ReportWorks.

Sr. Consultant

Confidential

Responsibilities:

  • Managed development and integration of SAS/DB2-based ad-hoc reporting requests for Hospira and Abbott.
  • Managed development and integration of HR application systems for Hospira similar to those of Abbott.
  • Developed Cognos Impromptu-based ad-hoc reports for Hospira and Abbott.

Sr. Technical Consultant/Project Lead

Confidential - AR

Responsibilities:

  • Involved in the ongoing delivery of migrating client mini data warehouses and functional data marts from an Oracle environment to Teradata.
  • Designed ETL processes to extract data from DB2 UDB and Oracle to Teradata, performing data cleansing, transformation, and loading for web reporting.
  • Fine-tuned and improved the performance of the RevenuePlus workload process, which previously ran for about 27 hours; after tuning, it runs in about an hour.
  • Mentored Cognos Impromptu developers.
  • Provided solutions for Remote execution of Cognos Impromptu reports across the enterprise.

Technical Consultant

Confidential

Responsibilities:

  • Developed and tested SCORE interface with StrategyOne system.
  • Enhanced the concurrent processing of Data.
  • Enhanced SM1000 name matching.
  • Downloaded data from Legacy Mainframe Systems using COBOL and JCL.
  • Loaded flat files into two Oracle databases using SQL*Loader (see the loader sketch below).
  • Enhanced trade Data Suppression criteria.
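
The SQL*Loader loads above follow the standard control-file pattern; a minimal sketch with hypothetical table, column, and file names:

    #!/bin/ksh
    # Control file mapping a comma-delimited extract onto the target table.
    cat > trade.ctl <<'EOF'
    LOAD DATA
    INFILE '/data/landing/trades.dat'
    APPEND INTO TABLE trades
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (trade_id, account_no, trade_dt DATE "YYYYMMDD", amount)
    EOF
    # Run the same load against each target database.
    sqlldr userid=loader/$ORA_PASS@DB1 control=trade.ctl log=trade_db1.log
    sqlldr userid=loader/$ORA_PASS@DB2 control=trade.ctl log=trade_db2.log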

Technical Consultant

Confidential

Responsibilities:

  • Developed Java/C native interface (JNI) code, JSPs, Servlets, and JavaScript for the application.
  • Responsible for testing, troubleshooting and fixing bugs.
  • Enhanced and maintained an Oracle 7.0-based database bug tracking system.

Technical Consultant

Confidential

Responsibilities:

  • Responsible for Java application development, Integration and Testing.
  • Worked as Technical team lead, coordinated Java development and implementation of financial credit reporting system modules for Hong Kong.
  • Coordinated application development activities between Hong Kong and International Team.
  • Integrated modules developed by Hong Kong and Indian team into common framework for deployment.

Technical Consultant

Confidential

Responsibilities:

  • Responsible for AR/AP/GL/PO Customization, Testing and Implementation.
  • Designed and created new ORACLE database tables and wrote UNIX shell scripts and SQL/SQL*Loader scripts to load data from the legacy (COBOL, JCL, DB2, OS/390) system.
  • Wrote Interface SQR to load legacy data (Vendor, Item, Location, and UOM) into PeopleSoft ORACLE DB.
  • Customized SQR and GL/nVision reports (Cost Center statements, Expense Trends, Budget Trends, and Month-end Cash/Profit/Gain).

Technical Consultant

Confidential

Responsibilities:

  • Data Conversion/Migration: Upgraded the Investment accounting system from release 97.1 to 98.2 and upgraded the ORACLE database to v7.3.4.
