
Sr Consultant Resume


Chicago, IL

SUMMARY:

  • More than 25 years of Information Technology experience, including data warehouse and data mart systems, design, analysis, conversion, and full project lifecycle experience. Design, extraction, transformation, and load experience involving multiple data sources across multiple platforms. Successfully loaded large warehouse and data mart systems.
  • 12 years of Informatica - v9.6.1 (BDE), 8, 7, 6
  • 1 year of Informatica IDQ
  • 1 year of Informatica MDM
  • 1 year of Big Data (Hadoop, Hortonworks, HIVE, AMBARI, ZOOKEEPER)
  • 2 months of Pentaho Kettle
  • 16 years of DB2.
  • 1 year of GreenPlum
  • 5 years of Teradata
  • 8 years of IBM UDB 7.0 / 8.2
  • 12 years of Unix scripts
  • 8 years of SAS (Mainframe + Unix)
  • 12 years of Oracle.
  • 1 year of SQL Server
  • 6 years of IMS.
  • 15 years of COBOL, VS-COBOL II, JCL, VSAM
  • 15 years of data conversion, data cleanup, and application development.
  • Data Cleanup, Data Conversion (5 years)
  • Three-plus (3+) years of extraction and transformation from multiple data sources: IMS DSL, DB2 DSL, COBOL/FS DSL, ORACLE DSL, and SAP.
  • Certified Master Consultant with ETI*EXTRACT experience (since 4/97).
  • Four months of Web Development experience in a multi-tier architecture.

TECHNICAL SKILLS:

Skills: SAP (SD/CD/FI), Salesforce, Informatica v9.1, 8.6.0, 7.2, 6.2, Informatica Cloud (SAP & Salesforce integration), FACETS 4.3, 5.2, SAS (Mainframe & Unix, 9.1, 9.2), Hyperion Essbase OLAP 9, Cognos 8, ETI*EXTRACT v3.0.2, v3.0.3, v4.0, FX, COBOL FS, IMS DB, DB2 v6, IBM UDB 7.0, 7.2, 8.2, COBOL, COBOL II, PRISM Warehouse Executive v1.5, SQL, DL/I, CICS, VSAM, ISPF, MVS-JCL, VSE-JCL, PACBASE, DESIGN/I, INSTALL/I, IEF, MVS/ESA, MVS/XA, DOS/VSE, SSP, AOS/VS, MS-DOS, Unix shell scripts, ORACLE, SQL*Plus, Pro*C, ASP, HTML, VB, Big Data (Hortonworks), GreenPlum database, HP Fortify.

Knowledge: Pentaho Data Integration Tool, Talend Data Integration Tool.

EXPERIENCE:

Confidential, Chicago, IL

Sr Consultant

Responsibilities:

  • Involved as Sr. ETL Consultant to convert the PIMS database to the Passport system database.
  • This was a one-time database conversion project.
  • Used Informatica Power Center, Informatica BDE, HIVE, AMBARI, and YARN.
  • Responsible for analysis, design, development, and implementation of the Informatica mappings.
  • Both source and target databases were DB2. Used Informatica, Mainframe, and Unix scripts.
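The core of a one-time conversion like this is a mapping that extracts each legacy row, reshapes it to the target schema, and loads it. A minimal sketch of that pattern, using sqlite3 as a stand-in for the DB2 source and target (the table and column names here are hypothetical, not the actual PIMS/Passport schema):

```python
# Illustrative one-time table conversion: extract from the legacy table,
# remap fields, load the target. sqlite3 stands in for DB2; the tables
# pims_patient / passport_patient are hypothetical examples.
import sqlite3

def convert_patients(src: sqlite3.Connection, tgt: sqlite3.Connection) -> int:
    """Extract rows from the legacy table, remap fields, load the target."""
    rows = src.execute("SELECT patient_id, last_nm, first_nm FROM pims_patient")
    loaded = 0
    for pid, last, first in rows:
        # Example transformation rule: the target stores one display name.
        tgt.execute(
            "INSERT INTO passport_patient (patient_id, full_name) VALUES (?, ?)",
            (pid, f"{first.strip()} {last.strip()}"),
        )
        loaded += 1
    tgt.commit()
    return loaded

src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE pims_patient (patient_id INT, last_nm TEXT, first_nm TEXT)")
src.executemany("INSERT INTO pims_patient VALUES (?, ?, ?)",
                [(1, "Doe ", "Jane"), (2, "Roe", "John ")])
tgt.execute("CREATE TABLE passport_patient (patient_id INT, full_name TEXT)")
print(convert_patients(src, tgt))  # 2
```

In an Informatica mapping the same remap would live in an Expression transformation between the source qualifier and the target; the sketch only shows the row-level logic.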

Confidential, Minneapolis, MN

Sr Consultant

Responsibilities:

  • Involved as a Mainframe Security Consultant.
  • This was a multi-phase project to produce multiple documents related to security issues.
  • Used HP Fortify security software.
  • Responsible for the OWASP Top 10 security issues.
  • Analyzed and produced the following documents for Phase I:
  • Security Requirements
  • Threat Modeling/Application Threat Profiling
  • Deprecated Unsafe Functionality
  • Secure Coding Standard
  • Static Analysis.
  • Delivered the above documents successfully, within project budget and timeline.

Confidential, Warren, NJ

Sr Consultant / Team Lead

Responsibilities:

  • Involved as Solution Architect to set up the ETL batch job environment from scratch for a Data Lake (Big Data) for health insurance (Wellcare, Tampa, FL). This project required extracting data from multiple source systems and loading it into the Data Lake and then into the data warehouse.
  • My responsibilities included Team Lead plus Sr ETL and MDM Consultant. Designed the system to get data from multiple source systems and load it through the data lake into a GreenPlum database. We used Informatica Big Data Edition (BDE) and Power Center for ETL, the Informatica MDM tool for MDM, Linux BASH scripts, and Hortonworks for Big Data. Used HIVE, HDFS, FLUME, AMBARI, YARN, Tableau, and the GreenPlum database to build the reports. Used SQOOP to load data from Oracle into the Big Data Data Lake, and Flume to load data into target tables.
  • Involved as Sr Informatica ETL Consultant to retrieve data from Salesforce, Oracle database and load into Oracle EDW for Verizon Telematics. Used Informatica IDQ to build mapplets for data validation.
  • Used Informatica MDM to retrieve the data for staging area and populate into MDM.
  • This required getting data from files and loading it into the Oracle Data Warehouse. Performed the analysis and design and loaded the data into the data warehouse.
  • Used Pentaho Data Integrator, JAVA, Linux.
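Landing extracted source data in the lake typically means writing it into a date-partitioned directory layout that HIVE can read, the way a Sqoop import does. A minimal sketch of that partitioned-landing step, with an illustrative `dt=YYYY-MM-DD` layout and hypothetical field names (not the actual project schema):

```python
# Sketch of landing extracted rows into a date-partitioned directory
# layout, mimicking the dt=YYYY-MM-DD HIVE partition convention a Sqoop
# import into HDFS would produce. Table/field names are illustrative.
import csv
import os
import tempfile
from collections import defaultdict

def land_rows(base_dir, table, rows):
    """Group rows by load date and write one part file per partition."""
    by_date = defaultdict(list)
    for row in rows:
        by_date[row["load_dt"]].append(row)
    written = []
    for dt, part in sorted(by_date.items()):
        part_dir = os.path.join(base_dir, table, f"dt={dt}")
        os.makedirs(part_dir, exist_ok=True)  # one directory per partition
        path = os.path.join(part_dir, "part-00000.csv")
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["member_id", "load_dt"])
            writer.writeheader()
            writer.writerows(part)
        written.append(path)
    return written

base = tempfile.mkdtemp()
paths = land_rows(base, "claims", [
    {"member_id": "M1", "load_dt": "2016-01-01"},
    {"member_id": "M2", "load_dt": "2016-01-02"},
])
print(len(paths))  # 2
```

On a real cluster the files would be HDFS blocks rather than local CSVs, and a HIVE external table partitioned on `dt` would sit over the same layout.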

Confidential, Chicago, IL

ETL Architect

Responsibilities:

  • Involved in the collection of medical student survey data, transforming it, loading it into the Data Warehouse (DB2 database), and building reports using Tableau.
  • My responsibilities included designing, developing, and implementing Informatica mappings, sessions, and workflows. We used Informatica Power Center and Unix Kshell scripts for ETL.

Confidential, Detroit, MI

ETL Architect

Responsibilities:

  • Involved in integration; designed batch processes to run batch jobs using the Open Process batch scheduler.
  • Supported offshore and onshore test teams to meet aggressive target dates.
  • My responsibilities included designing core Facets, Informatica, DataStage, and Cognos batch jobs, then scheduling, testing, and implementing them in production.
  • Open Process schedules jobs across multiple operating systems.
  • We used Confidential's Facets system, DataStage and Informatica ETL, Unix scripts, and Cognos reporting jobs.
  • Have good knowledge of the Facets healthcare data model.
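The heart of a cross-platform batch schedule like this is a dependency graph: each downstream job (e.g. a Cognos report) runs only after its upstream ETL jobs finish. A small sketch of that ordering logic, with hypothetical job names standing in for the real Facets/DataStage/Cognos jobs (Python's `graphlib` here plays the role a scheduler such as Open Process plays in production):

```python
# Sketch of batch-job dependency ordering: map each job to the set of
# jobs that must finish first, then compute a valid run order.
# Job names are hypothetical examples, not the actual job schedule.
from graphlib import TopologicalSorter

deps = {
    "facets_extract": set(),                              # no predecessors
    "informatica_load": {"facets_extract"},
    "datastage_load": {"facets_extract"},
    "cognos_report": {"informatica_load", "datastage_load"},
}

# static_order() yields every job after all of its predecessors.
order = list(TopologicalSorter(deps).static_order())
print(order[0], order[-1])  # facets_extract cognos_report
```

A production scheduler adds calendars, retries, and cross-host agents on top, but the run order it computes is exactly this topological sort.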

Confidential, Chicago, IL

ETL/Data Warehouse Architect

Responsibilities:

  • Involved in analysis and design to integrate SAP and Salesforce.
  • Led a team of 4 ETL members and used Agile project methodology.
  • My responsibilities included analysis, design, development, and implementation of the system.
  • Set up Informatica Cloud and ETL programs to load data from SAP to Salesforce and vice versa.
  • Also designed and analyzed ETL programs from SAP (MM, SD) to the Data Warehouse.
  • The system uses Oracle, SAP R/3 (MM, SD), and ABAP, with INFORMATICA v9.1, Informatica Cloud programs, UNIX scripts, unit testing, integration testing, and performance tuning.
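The row-level work in an SAP-to-Salesforce integration task is largely a field mapping: rename each SAP field to its Salesforce counterpart and drop what isn't mapped. A hedged sketch of that step (the mapping itself is an illustrative example, not the project's actual field list; in Informatica Cloud this lives in the synchronization task's field mapping):

```python
# Illustrative SAP -> Salesforce field mapping of the kind an Informatica
# Cloud task performs. The chosen fields are examples, not the actual
# project mapping.
SAP_TO_SF = {
    "KUNNR": "AccountNumber",  # SAP customer number -> Salesforce account number
    "NAME1": "Name",
    "ORT01": "BillingCity",
}

def sap_to_salesforce(sap_record: dict) -> dict:
    """Rename SAP fields to their Salesforce equivalents; drop unmapped fields."""
    return {sf: sap_record[sap] for sap, sf in SAP_TO_SF.items() if sap in sap_record}

rec = sap_to_salesforce({"KUNNR": "0000100001", "NAME1": "Acme Corp", "ORT01": "Chicago"})
print(rec["Name"])  # Acme Corp
```

The reverse direction is the inverted dictionary; lookup conditions, defaults, and error handling are layered on top in the actual task.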

Confidential, Hartford, CT

Solution/ETL Architect

Responsibilities:

  • This was a 4-week POC project to integrate Salesforce and SAP Cloud using Informatica Cloud.
  • Integrated data between Salesforce and SAP CRM.
  • Demonstrated and implemented jobs in the cloud environment.
  • The project integrated SAP SD with Salesforce.
  • My responsibility was to develop and implement Informatica cloud jobs.

Confidential, Denver, CO

Solution Architect / Team Lead

Responsibilities:

  • Involved as Solution Architect to set up the ETL batch job environment from scratch. Also involved in analysis and development of the interface between Facets (Health Care System) and SAP FI (CD, Collection & Disbursement).
  • This project required extracting data from multiple sources, viz. ORACLE, flat files, and SAP R/3/BW tables, then transforming the data, building SAP Docs, and loading into SAP R/3/BW databases and vice versa. Involved in ETL (Extraction, Transformation, and Load) design; resolved data model issues (10%), mapping issues (10%), business rules issues (10%), and SAP BW issues (10%). Involved in the Data Modeling, ETL, and BI Architecture committee to create standard procedures.
  • Used Waterfall and Agile project methodologies. Also involved in scrums and project status.
  • My responsibilities included ETL Team Lead plus Informatica Admin, designing/developing ETL programs to create SAP BB, BG, and GG Docs using INFORMATICA v9.1, 8.6, 7.3, UNIX scripts, Perl, unit testing, integration testing, implementation, production support, and performance tuning.

Confidential, IL

Sr. Data Warehouse Consultant

Responsibilities:

  • This project required extraction of data from multiple sources, viz. ORACLE, UDB DB2, and QSAM files, then transforming and loading into DB2 UDB in an AIX environment.
  • Involved in ETL (Extraction, Transformation, and Load) design; resolved mapping issues, business rule issues, Data Warehouse design issues, granularity issues, and Data Warehouse data model issues.
  • Resolved numerous Data Warehouse data model granularity issues, drawing on extensive Data Warehouse experience.
  • Used SAS for financial reporting. Involved in JAD sessions for requirements gathering. Implemented the OTS project and the Supervision Archival project.
  • My responsibilities included designing/developing ETL programs using INFORMATICA v7.2 and 6.2, unit testing using Unix scripts, and integration testing and implementation on AIX. Involved in the following internal projects: a) Order Tracking System (OTS), June 04 - March 05; b) Supervision OLAP Data Archival, March 05 - September 05; c) Rate Tiering, September 05 - June 06.

Confidential, IL

Sr. Data Warehouse Consultant

Responsibilities:

  • This project required extraction of data from multiple sources, viz. DB2 and QSAM files, then transforming and loading into DB2 UDB in an AIX environment. Involved in ETL (Extraction, Transformation, and Load) design; resolved mapping issues, business rule issues, Data Warehouse design issues, granularity issues, and Data Warehouse data model issues.
  • Resolved numerous Data Warehouse data model granularity issues, drawing on extensive Data Warehouse experience.
  • We extracted data from the Sales Reporting system and loaded the data into a DB2 UDB database.
  • Used SAS for sales reports and an OLAP tool to build the front-end cubes and reports that access the database.
  • My responsibilities included ETL Team Lead plus designing/developing ETL programs, UNIX scripts, unit testing, and integration testing on Mainframe as well as AIX.

Confidential, WA

Sr. Data Warehouse Consultant

Responsibilities:

  • Involved in ETL (Extraction, Transformation, and Load) design; resolved mapping issues, business rule issues, Data Warehouse design issues, granularity issues, and Data Warehouse data model issues.
  • Resolved numerous Data Warehouse data model granularity issues, drawing on extensive Data Warehouse experience.
  • We extracted data from the Facets healthcare management system and loaded the data into a DB2 UDB database.
  • Used the COGNOS OLAP tool to build the front-end cubes and reports that access the database.
  • My responsibilities included designing/developing ETL programs using the DataStage suite, unit testing, and integration testing on Mainframe and AIX.

Confidential, CA

Sr. Data Warehouse Consultant

Responsibilities:

  • Involved in building the Sales Data Warehouse project on a Windows 2000 server.
  • This project compared Microsoft SQL Server and DB2 UDB on performance and on ease of building and running a data warehouse.
  • This was a 6-week short-term project.
  • Involved in installing DB2 UDB v7.2, Data Warehouse Manager, OLAP Server, and the Hyperion Essbase OLAP tool.
  • Utilized IBM's ETL tool to move data from a Microsoft Access database to UDB v7.2.
  • Built the cubes per the client's requirements, loaded data into the cubes, and performed DB2 performance tuning.

Environment: ETI 4.1.2 FX, COBOL/FS, COBOL/DB2 DSLs, COBOL, JCL, shell script, DB2, UDB, MVS, AIX, COGNOS.
