Senior ETL/DW Architect and Developer Resume
McLean, VA
PROFESSIONAL SUMMARY:
- Over 18 years of IT experience, including more than 15 years as a DW/BI architect and engineer.
- Experience across all phases of the Data Warehouse/BI lifecycle (Tableau, Jaspersoft, Pentaho, Looker, OBIEE, Cognos), from requirements gathering through development, testing, and implementation.
- Data Modeling (Logical and Physical Data Models) and Data Integration for EDW.
- Big Data Hadoop Technology (HDFS, Map Reduce, Yarn, Hive, Pig, Sqoop, Oozie, Flume, Hbase, Spark)
- Expertise in designing and developing ETL processes with tools such as Talend, SSIS, Pentaho DI, and Informatica.
- Experience with AWS EC2/S3/EMR/Redshift/RDS on the Cloud Service.
- Familiar with Master Data Management (MDM).
- Programming and scripting (Unix shell/Python/Java/XML).
- Exceptional background in analysis, design, development, testing, and implementation of DWH/BI environments with multidimensional models, Slowly Changing Dimension (SCD) Types 1, 2, and 3, and Changed Data Capture (CDC), following the Kimball methodology (Star/Snowflake schemas).
- Designed MDM, ODS, staging, multidimensional, fact, and data mart tables.
- Demonstrated expertise on various ETL tools (SSIS, Talend, Informatica, Pentaho DI, Oracle DI, DataStage, Cast Iron and PL/SQL).
- Advised and consulted for technical groups on ETL best practices.
- Managed daily support activities, including Business Intelligence and reporting.
- Excellent technical and analytical skills with a clear understanding of business requirements.
- Team lead/player with good communication and troubleshooting/problem-solving skills.
- Led the implementation of effective data integration solutions.
- Strong analytical, problem-solving, and collaborative skills.
- Experience working in Agile Scrum methodology.
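For illustration, the SCD Type 2 pattern referenced above can be sketched in a few lines of Python. The table and column names here are hypothetical; the actual implementations used the ETL tools listed below, but the logic is the same: an incoming change expires the current dimension row and inserts a new version.

```python
import sqlite3

# Hypothetical customer dimension for an SCD Type 2 sketch (SQLite in-memory).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_start   TEXT,
        eff_end     TEXT,
        is_current  INTEGER
    )""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Boston', '2020-01-01', '9999-12-31', 1)")

def apply_scd2(conn, customer_id, new_city, change_date):
    """If the tracked attribute changed, expire the current row and insert a new version."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if row and row[0] != new_city:
        # Close out the old version...
        conn.execute(
            "UPDATE dim_customer SET eff_end=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (change_date, customer_id))
        # ...and insert the new current version.
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
            (customer_id, new_city, change_date))

apply_scd2(conn, 1, "McLean", "2021-06-01")
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_start").fetchall()
# rows -> [('Boston', 0), ('McLean', 1)]
```

A Type 1 variant would simply overwrite the attribute in place, and Type 3 would keep the prior value in an extra column; Type 2, shown here, preserves full history via effective-date ranges.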
TECHNICAL SKILLS:
ETL Tools: Talend, Sqoop, Flume, Informatica, Python, Pentaho Data Integrator, SSIS, Oracle Data Integrator (ODI), DataStage, Cast Iron
Development tools: PL/SQL, Toad, SQL Developer/Data Modeler, SQL*Loader, PostgreSQL, XML, HTML, T-SQL, MySQL Workbench, MS Project, Visio, PowerPoint
Data Model Tools: Erwin, Visio, ER Studio, Oracle SQL Developer Data Modeler
BI Reporting Tools: Tableau, OBIEE, Cognos, Jaspersoft, Pentaho, QlikView, Domo, Looker
DB: Oracle, MySQL, AWS Redshift, PostgreSQL, MS SQL Server, and DB2
O/S: UNIX/Linux, Windows NT, Mainframe (IBM)
PROFESSIONAL EXPERIENCE:
Confidential - McLean, VA
Senior ETL/DW Architect and Developer
Responsibilities:
- Designed/developed the Talend ETL framework and jobs to load data from text and JSON files as well as RDBs to the EDW (RDB) and Data Lake (AWS S3).
- Set up job conductors and execution plans (job scheduling by trigger) in Talend Administration Center (Enterprise Edition).
- Successfully converted DTS packages to SSIS on MS SQL Server.
- Developed SSIS Packages to load data from flat files to the tables.
- Established/designed Virtual data warehouse for integrated Credit Union Analytic Model.
- Setup several JDBC/ODBC data connections for source systems to Denodo (Data Virtualization).
- Created Source-to-Target Mapping (STTM) documents, data lineage, and data models for Purchased/Student Loan.
- Designed/built business views integrating data from multiple sources, such as the Credit Risk, Finance, Loan Service, SFDC, and Fraud analytic models.
- Created visual dashboards and reports in the BI reporting tool (Tableau).
- Initiated a POC of a Big Data Hadoop system: HDFS, HBase, Hive, Pig, MapReduce, NoSQL, and Spark.
- Troubleshooting and problem solving.
- Worked in Agile Scrum methodology (PI planning, epics, sprints, stories, and tasks in Jira).
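As a minimal sketch of the text/JSON-to-EDW loads described above (in practice these were Talend jobs; the file layout and table names here are hypothetical), a staging load can be illustrated in plain Python:

```python
import io
import json
import sqlite3

# Hypothetical JSON-lines source feeding a staging table in the EDW.
source = io.StringIO(
    '{"loan_id": 101, "balance": 2500.0}\n'
    '{"loan_id": 102, "balance": 8100.5}\n')

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_loan (loan_id INTEGER PRIMARY KEY, balance REAL)")

for line in source:
    rec = json.loads(line)
    # Upsert so reruns of the job stay idempotent.
    conn.execute(
        "INSERT INTO stg_loan (loan_id, balance) VALUES (?, ?) "
        "ON CONFLICT(loan_id) DO UPDATE SET balance=excluded.balance",
        (rec["loan_id"], rec["balance"]))

count = conn.execute("SELECT COUNT(*) FROM stg_loan").fetchone()[0]
```

The upsert keeps the staging load restartable, which matters when a job conductor retriggers a failed execution plan.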
Confidential - Boston, MA
Senior Data Architect/Developer
Responsibilities:
- MS SQL Server data warehouse modeling for EPIC Clinical data integration.
- Designed logical/physical data models for analytic models such as Finance Performance, Benchmarks, Patient Care, and Physician Performance.
- Developed ETL jobs with SQL Server Integration/Analysis Services (SSIS, SSAS), Oracle, and Talend.
- Created tables, views, triggers, and partitions, and built stored procedures with T-SQL.
- Monitored daily/monthly batch job activity in SQL Server Agent.
- Installed and configured Big Data Hadoop (Cloudera and EM) on AWS Cloud as a POC.
- Developed Hive, Pig, and Spark with Python scripts for Clinical/Patient/Physician Data Analytics.
- Enhanced existing business models in Cognos Framework Manager as well as Query/Report Studio.
Confidential - Somerville, MA
Data Warehouse/BI Architect
Responsibilities:
- Gathered and analyzed business requirements from the internal Marketing and Customer Service teams.
- Performed data mapping to integrate operational data with other systems.
- Rebuilt the existing enterprise data warehouse architecture.
- Built staging, ODS, and DWH tables with partitioning and parallelism.
- Built the data model based on the Kimball methodology with Star/Snowflake schemas.
- Designed DWH tables with dimension, fact, and aggregation tables.
- Created MySQL/RDS and Aurora DB instances in AWS.
- Set up the Hadoop cluster and name/data nodes with YARN for HDFS in Cloudera.
- Developed Big Data (MapReduce, Pig, Hive, Spark, Sqoop, and Flume) scripts.
- Developed an ETL job to store unstructured data in NoSQL (HBase).
- Unix/Linux shell scripting.
- Created stored procedures and SQL queries.
- Set up data connections/domains to various data sources in Jaspersoft Server.
- Created BI analytic reporting (ad hoc views, reports, dashboards).
- Created Talend Data Integration and Big Data jobs.
- Created a new BI analytics platform with Pentaho BI and DI.
- Managed offshore resources.
- Troubleshooting and problem solving.
Technologies: Big Data Hadoop cluster (Hive, Pig, Spark, Python, MapReduce, HBase), Talend ETL, MySQL, MySQL Workbench, Unix shell scripts, AWS RDS, Aurora DB, REST API, HTML, JSON, Java, JavaScript
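The incremental ingestion work above (Sqoop-style extraction driven by a watermark, similar to Sqoop's `--incremental lastmodified` mode) can be sketched as follows; the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical source table with an update timestamp; an incremental
# extract pulls only rows changed since the last recorded watermark.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, updated_at TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [
    (1, "2021-03-01 09:00:00"),
    (2, "2021-03-02 09:00:00"),
    (3, "2021-03-03 09:00:00"),
])

def incremental_extract(conn, watermark):
    """Return rows changed after the watermark, plus the new watermark."""
    rows = conn.execute(
        "SELECT order_id, updated_at FROM orders WHERE updated_at > ? "
        "ORDER BY updated_at", (watermark,)).fetchall()
    new_watermark = rows[-1][1] if rows else watermark
    return rows, new_watermark

rows, wm = incremental_extract(conn, "2021-03-01 12:00:00")
# rows -> [(2, '2021-03-02 09:00:00'), (3, '2021-03-03 09:00:00')]
```

Persisting the returned watermark between runs (Sqoop stores it in the saved-job metastore) is what keeps each batch pulling only the delta.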
Confidential - Boston, MA
Senior BI/DW Architect and Lead
Responsibilities:
- Gathered business requirements for reporting and dashboards from business groups.
- Built Customer and Referral data models (conceptual/logical/physical data models).
- Big Data (Hadoop, Java MR, Pig, Hive) on Hortonworks; created functional and technical design documents in Confluence.
- Configured/developed ETL jobs using Talend (Data Integrator, MDM), PL/SQL, and SQL*Loader.
- Created/maintained batch jobs scheduled by cron/ActiveBatch.
- Developed the Budget Planning and Forecast upload process to the EDW.
- Created materialized views, dimension and fact tables, triggers, and sequences.
- Constructed Master Data Management (MDM) and Reference Data Management (RDM).
- Developed data sources, business models, filters, and packages in Cognos Framework Manager (v10.2).
- Created BI ad hoc reports using Cognos, Tableau, and Jaspersoft.
- Data Integration from multiple sources to EDW Environment.
Technologies: Big Data Hadoop (HDFS, HBase, YARN, MapReduce, Hive, Pig), Tableau, Talend (DI, MDM), MS SQL Server, SSIS, Oracle 10g, PL/SQL, MySQL, Cognos v10 (Framework Manager, Report Studio, Workspace), JasperSoft, SQL*Plus, Toad, SQL Developer Data Modeler, SQL*Loader, Linux, Windows, ActiveBatch, cron job scheduler, materialized views
Confidential - Boston, MA
Senior BI/DW OBIEE Consultant
Responsibilities:
- Analyzed business users' requirements and built logical and physical data models with Star/Snowflake schemas.
- Designed and developed OBIEE analyses, dashboards, and BI Publisher reports using OBIEE 11g.
- Built the repository and hierarchies in the OBIEE Admin Tool (RPD).
- Developed and enhanced processes to load/extract data from the ODS to the DW schema using PL/SQL and SQL scripts, and assisted other team members.
- Supported business operations via OBIEE: investigated defects and issues, assisted with usage-tracking metrics, and updated existing BI Publisher reports, analyses, and dashboards.
- Created Unix/Linux shell scripts for ClearCase and the AutoSys job scheduler.
- Ensured that all incidents and requests were resolved and handled promptly and in order of priority.
- Created Informatica mapping jobs to load data from the operational system to the DWH.
Technologies: Oracle 11g, Informatica PowerCenter, OBIEE, BI Publisher, PL/SQL, SQL*Plus, Toad, SQL Developer, SQL*Loader, Unix, Windows, XML, SharePoint
Confidential - Chelmsford, MA
BI/DW ETL Lead/Architect
Responsibilities:
- Worked closely with Business users to understand their data needs and translate requirements into ETL design and integration workflows.
- Identified requirements, defined scope, performed cost-benefit analysis, and developed conceptual specifications to address complex business issues and problems.
- Analyzed and designed data models (physical/logical) for the Revenue Forecast, Booking, and Sales models.
- Designed functional/technical specifications based on the business requirements.
- Built the BI metadata repository in the OBIEE Admin Tool.
- Built Answers and dashboards for the Revenue Forecast model in OBIEE.
- Designed dimension and fact tables in Star/Snowflake schemas.
- Created staging-area, DW, aggregate, and data mart tables with SCD Types 1, 2, and 3.
- Created Oracle PL/SQL, materialized views, and functions for ETL and OBIEE.
- Created the physical and logical architecture and repository in Topology Manager.
- Created the data model, data stores, and project in Designer.
- Developed/managed the ETL Informatica repository, mappings, and workflows for batch-scheduled and ad hoc processing.
- Performed performance tuning on Informatica ETL mappings/workflows and PL/SQL and SQL scripts.
- Provided strong technical leadership on projects while working independently.
- Applied strong Data Warehouse/ETL architecture concepts and knowledge.
- Built strong relationships with other project members and 3rd-party consultants.
- Utilized 3rd-party development tools for system/data integration.
- Demonstrated strong commitment to continuous improvement in process and technology.
- Troubleshot and resolved end-user issues and problems.
Technologies: ODI v10, Informatica Power Center v9, DAC, Oracle 11g, OBIEE, PL/SQL, SQL Plus, Toad, SQL Developer, SQL*Loader, Linux, Erwin, MS SQL Server, T-SQL
Confidential - Arlington, VA
DW/ETL Specialist
Responsibilities:
- Analyzed and configured the ETL system to convert from Informatica to DataStage.
- Designed and developed ETL (DataStage, Informatica), PL/SQL, stored procedures, configuration files, tables, views, and functions; implemented best practices to maintain optimal performance.
- Advised other groups in the organization on ETL and data warehouse development best practices.
- Conducted code walkthroughs and reviewed peer code and documentation.
- Built efficient ETLs and test scripts for processing data loads from the Financial/Payroll/HR systems to EPM as well as 3rd-party systems.
- Established shell scripts/runsheets of ETL jobs for batch processing via Control-M.
- Work with other developers, DBAs, and systems support personnel in elevating and automating successful code to production.
- Provide on-call support to production system to resolve any issues.
- Improved the weekly data load process performance from 6 hours to 1 hour.
Technologies: Informatica Power Center v7, DataStage v7/8, Oracle 10g, PL/SQL, SQL Plus, Toad, SQL Developer, SQL*Loader, Unix, Erwin
Confidential - Boston, MA
Senior ETL Consultant
Responsibilities:
- Designed Data Model, ODS layers, Data Warehousing, and Data Mart Tables for Dashboard Scorecards.
- Developed Informatica ETLs to interface data from FIN/HR/Hyperion/CRM to EPM.
- Customized/Developed Application Engines, SQRs, PS Query, and SQL Scripts.
- Setup Record Metadata and defined Dimension/Fact Tables for DWH and Data Mart.
- Provided technical and functional support with production support for PeopleSoft EPM with HR, Financial, and CRM system.
- Identified, defined, and implemented new processes as required by business needs.
- Reconciled the data between PeopleSoft Financial/HR/CRM and EPM.
- Troubleshooting system issues for PeopleSoft EPM and Financial 8.4 (GL, AR, AP, BI, Project) applications.
- Created Test scripts and results.
- Report/query writing, development, and maintenance.
Technologies: Informatica Power Center v7, Oracle 10g, PL/SQL, SQL Plus, Toad, SQL Developer, SQL*Loader, UNIX, Erwin v7
Confidential
PeopleSoft Consultant
Responsibilities:
- Implemented PeopleSoft v8.3/8.8 (HR, Global Payroll Interface System, Benefits).
- Production support for interface processes via FTP, XCOM, and XML on Global Payroll System.
- Analyzed, designed technical specs, and developed batch (SQR) and online processes (PeopleCode, App Engine) for MPC Compensation, job code changes, mass changes, terminated EE lists, etc.
- Enhanced Work Location, Expatriate EE Payroll, personal data changes, cash awards, and Cost Center with PeopleCode, PeopleTools, and App Engines.
- Created new PS queries and updated existing queries for user’s needs.
- Resolved HR problem tickets, Troubleshooting, Performance Monitoring.
- On call production support.
Confidential
PeopleSoft/ETL Consultant
Responsibilities:
- Implemented PeopleSoft EPM v8.8/CRM v8.4; fixed and enhanced issues and defects in the CDM, Sales, Marketing, and Support modules.
- Developed/modified interface job processes from existing custom data (CRM) into EPM (ODS layer) via the ETL tool (Ascential DataStage) for consolidated data, with incremental and destructive loads.
- Maintained the interface process from 3rd-party XML files to PeopleSoft EPM.
- Created Dimension, Fact, Error, Temporary tables for EPM Data Warehousing.
- Created metadata, table maps, data maps, and the data loader process with the EPM App Engine.
- Designed Technical specs and built Data loader processes from ODS layers into Data Warehouse Tables (Dimension/Fact Tables, etc) on Product, Issue, Advisor modules, etc.
- Developed/Customized Application Engines, SQR and Crystal Reports for marketing/sales.
- Created many Lookup Views and SQL objects for Daily/Weekly/Monthly Reports.
- Created run control pages/menu for the new custom reports and set up in process scheduler.
- Wrote unit test plan and support to functional analysts to create the PS Queries.
- Fixing critical defects and performance monitoring on data loading processes.
- Developed the BlackBerry (Mobile Sales) interface system between SQL Server and PeopleSoft CRM.
- Successfully implemented EPM v8.8.
- Created several complex reports for the Marketing and Sales modules on schedule in PS CRM/EPM.
- Participated in the development of BlackBerry (RIM) interface modules for PeopleSoft.
Confidential
PeopleSoft Consultant
Responsibilities:
- Implemented PeopleSoft HRMS v7.5/8.3 modules (HR, Payroll, Time & Labor, Benefits, Recruit Workforce) and Financials (GL/AM).
- Upgraded SQR programs, PeopleCode, and PS tables for the HRMS v7.5 to 8.3 upgrade.
- Designed technical specifications, modified objects (Components, Pages, Table Structures, etc.).
- Developed/customized App Engine programs for complex calculations and changes in business rules.
- Production Support for HR defect corrections/Data change requests.
- On call production support for PeopleSoft HRMS applications (HR, Payroll, Benefits Administration and Base Benefits, time and labor).
- Customized pages, components, menus for HR, Payroll, Base Benefits, and GL/AM, etc.
- Created multiple SQR reports, Crystal reports - HR, Payroll, Benefits, Financial reports (GL, AM).
Confidential
Programmer/Analyst
Responsibilities:
- Developed application programs, mainly interface programs, for a bank client to control and manage information on financial transactions transmitted between the bank and a finance company.
- Converted the Banking System (Customer Information Management) from Mainframe to UNIX.
- Upgraded and enhanced applications for GIC, Mutual Funds and Checking/Savings Accounts in general system of financial business.
- Modified applications for Management Information System and Personnel Management System.
- Work involved analysis, design, specification, programming, menu displays, and testing; focused on analysis, coding, and testing of application programs as a core member of the team.
- Programs were written in Oracle Forms/Reports, Pro*C, PL/SQL, Pro*COBOL, Unix shell, and SQL*Plus against an Oracle database on HP-UX.