
Software Engineer Staff Resume

New Hartford, NY

SUMMARY:

  • I am a certified, experienced professional specializing in the architecture, design, development and delivery of Analytics solutions.
  • I am seeking an architect/technical leadership role with an organization that shares my commitment to increasing business performance through analytics and process improvement.
  • I have significant technical team leadership experience with Analytics projects using a wide variety of technologies within a large enterprise organization.
  • My daily activities in this role include maintaining implementation schedules, distributing workload, assigning tasks, mentoring junior team members, making architectural decisions, designing solutions, conducting peer reviews and performing development tasks.
  • Hands-on experience developing with SAP Predictive Analytics, R, Python, Confidential Data Lab, Anaconda and Jupyter Notebooks. Significant experience with Time Series forecasting methods. Completed Andrew Ng’s Machine Learning and Deep Learning Specialization.
  • Hands-on experience designing and creating solutions with TIBCO Data Virtualization using TIBCO’s Best Practices.
  • Created Virtual Data Models combining data from SAP HANA, Impala and Microsoft SQL Server.
  • Experienced in creating visualizations using Tableau 10 Desktop and Server. Supported establishing Tableau Server-to-HANA SSO using SAML authentication.
  • Extensive hands-on experience with the design and development of Attribute, Analytical and Calculation views. Significant experience with SQL scripting in both scripted Calculation Views and stored procedures. Highly proficient at performance tuning HANA models. Performed SLT replication of hundreds of tables; skilled with alternate read methods such as types 1, 4 and 5. Configured SAP HANA Direct Extractor Connection (Confidential) with the BW sidecar approach. Expert-level knowledge of SAP HANA security. Configured SLT replication using Oracle as a source system. Expert-level knowledge of SDA and SDI real-time replication using OracleLogReader. Experienced with Agile Data Preparation.
  • Architecture, design and development experience with BOBJ tools including Data Services, Universe Design, Web Intelligence, Analysis for Office and Crystal Reports. Significant hands-on experience with SAP Data Services 4.0 - 4.2 and Business Objects Enterprise BI 4.0 - 4.2. Hands-on experience with SAP Predictive Analytics 2.4 - 3.2.
  • Hands-on experience designing and developing solutions using Cloudera, Hue, Linux, Sqoop (1 and 2), HDFS, Hive and Impala.
  • Extensive hands-on experience with all phases of the Software Development Lifecycle. Designed and built all BI 7.0 objects including InfoCubes, DSOs, Transformations, DTPs, InfoPackages, InfoSources, DataSources, InfoObjects, MultiProviders, InfoSets, Open Hub Destinations, Process Chains, and Queries. Performance tuning including index and aggregate creation, InfoCube remodeling and OLAP cache usage.
  • Data sourcing from both SAP (ECC and SRM) and non-SAP source systems (flat files, Oracle and MS SQL Server).
  • ABAP customization of transformations, enhancement of ECC datasources, and variable user exits in both back-end and front-end objects. In-depth knowledge of Logistics Cockpit extractors.
  • Significant query development experience including calculated and restricted key figures, replacement path variables, user exit variables and exception aggregation.
  • Formally trained in Microsoft 2008 SSIS and SSAS. Hands-on experience architecting and configuring the integration between SAP Business Warehouse and Microsoft SQL Server Integration Services. Developed SSIS packages and data flows to load SAP data into MS SQL Server.
  • Significant SSAS experience including design, development and performance tuning.
  • Excellent communication skills, strong root-cause analysis and analytical skills, ability to cross boundaries between technical and functional teams.
  • Exceptional at providing solutions to highly complex technical issues.
  • Expert level knowledge in many SAP application areas such as FI, CO, PP, IM, CATS, MM, PM, and QM.

SYSTEMS PROFICIENCY:

  • SAP HANA
  • SAP Business Objects XI 3.2 SP3, BI 4.0 SP 4, BI 4.1 SP 1 & 3, BI 4.2 SP1
  • SAP Data Services 3.2, 4.0 SP 3, 4.2 SP 1
  • BI/BW 7.0
  • ECC 6.0
  • Enterprise Portal
  • System Landscape Transformation DMIS 2010 SP7, SP8 and DMIS 2011 SP6 - SP10
  • Tableau 10
  • Confidential Data Labs
  • Jupyter Notebooks
  • SAP Web Intelligence
  • SAP Analysis for Office 2.2
  • SAP Predictive Analytics 2.4 - 3.0
  • SAP Lumira 1.30
  • Microsoft BI Stack 2008 R2
  • Unix, Linux, Windows, AWS, Data Lab, Oracle 11G & 12C, MS SQL Server 2008 - 2016, SAP HANA 1.0, Hadoop/Cloudera 5.7, Anaconda 5, TIBCO Data Virtualization (formerly Cisco DV)
  • ABAP, Python, R, JavaScript, AJAX, ColdFusion, WebFOCUS, VB and C# .NET, MS Visual Basic, C++, PLSQL, LINUX/UNIX scripting, VBA, SQLscript

EXPERIENCE:

Confidential, New Hartford, NY

Software Engineer Staff

Responsibilities:

  • Leading the Space Systems transition to Confidential Data Lab. Created a developer guide for the Space data science team using Confidential’s recommended best practices. Transitioned all existing projects from local development into the Confidential platform. Configured an organization and environment to be shared amongst the Space data science team. Connected the Space Bitbucket repository. Published examples of connecting to various Space Systems data sources.
  • Designing various regression and neural network models for the Liquid Apogee Engine to predict engine temperatures using a set of independent variables controllable by the pilot. Leading an effort to gather more meaningful and relevant data to improve results.
  • Developed and led data collection and analysis efforts for Multitier Supply Chain Insights. Gathered requirements and designed a solution for extracting Supply Chain data from unstructured sources and storing it in AWS S3 buckets. Led the efforts to collect Bloomberg SPLC data into the Data Warehouse. Worked closely with Bloomberg to diagnose and repair multiple data anomalies. Collaborated on the design of a Markov Chain-like solution for calculating the total exposure from one company to another using the intermediate tier data from Bloomberg (see the exposure sketch following this list).
  • Designed and developed a Cash Flow Forecasting predictive model for Accounts Payable and Accounts Receivable. Completed an in-depth comparison of various Time Series forecasting methods including Pyflux and Statsmodels for Python, the forecast package for R, and SAP Business Objects Predictive Analytics (see the forecasting sketch following this list). Developed Jupyter notebooks using the most accurate method found from the comparison analysis. Explored various other methods such as Deep Learning using Keras with LSTM and MLP networks.
  • Established connectivity from Python to TIBCO Data Virtualization.
  • Architecting a Data Lake within AWS GovCloud for Space Systems Data Science purposes that can communicate with applications and data sources contained within the Confidential intranet. Worked with the security team to resolve all firewall issues.
  • Supporting Long Range Business Planning applications using Monte Carlo simulations on projected sales over the next 5-10 years (see the simulation sketch following this list).
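
Exposure sketch referenced above. This is a hedged, self-contained Python illustration with hypothetical company names and spend fractions, not the production Bloomberg SPLC pipeline; it only shows the Markov Chain-like idea of propagating direct supplier exposure through intermediate tiers.

    import numpy as np

    # D[i, j] = fraction of company i's spend going directly to supplier j (hypothetical values)
    companies = ["OEM", "Tier1-A", "Tier1-B", "Tier2-C"]
    D = np.array([
        [0.0, 0.6, 0.4, 0.0],
        [0.0, 0.0, 0.0, 0.5],
        [0.0, 0.0, 0.0, 0.3],
        [0.0, 0.0, 0.0, 0.0],
    ])

    def total_exposure(direct, max_tiers=10):
        """Accumulate exposure across tiers: direct + second tier + ... up to max_tiers."""
        total = np.zeros_like(direct)
        step = np.eye(direct.shape[0])
        for _ in range(max_tiers):
            step = step @ direct        # exposure carried through one more tier
            total += step
        return total

    T = total_exposure(D)
    print(f"Total exposure of {companies[0]} to {companies[3]}: {T[0, 3]:.2f}")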
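
Forecasting sketch referenced above. A minimal back-test on a synthetic monthly cash-flow series using statsmodels; the actual comparison also covered Pyflux, R's forecast package and SAP Predictive Analytics, and the series, model orders and error metric here are assumptions for illustration only.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    # Synthetic monthly cash-flow series: trend + yearly seasonality + noise
    rng = np.random.default_rng(0)
    idx = pd.date_range("2015-01-31", periods=48, freq="M")
    y = pd.Series(1000 + 10 * np.arange(48)
                  + 200 * np.sin(np.arange(48) * 2 * np.pi / 12)
                  + rng.normal(0, 50, 48), index=idx)
    train, test = y[:-12], y[-12:]

    def mae(forecast):
        return float(np.mean(np.abs(forecast.values - test.values)))

    candidates = {
        "naive": pd.Series(np.repeat(train.iloc[-1], 12), index=test.index),
        "ARIMA(1,1,1)": ARIMA(train, order=(1, 1, 1)).fit().forecast(12),
        "Holt-Winters": ExponentialSmoothing(train, trend="add", seasonal="add",
                                             seasonal_periods=12).fit().forecast(12),
    }
    for name, forecast in candidates.items():
        print(f"{name:>14}: holdout MAE = {mae(forecast):.1f}")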
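
Simulation sketch referenced above. The growth and margin distributions, starting sales figure and ten-year horizon are hypothetical stand-ins, not the actual long-range planning assumptions; the point is the simulate-then-take-percentiles pattern.

    import numpy as np

    rng = np.random.default_rng(42)
    n_sims, years = 10_000, 10
    base_sales = 500.0                                      # starting annual sales in $M (hypothetical)

    # Sample an annual growth rate and margin for every simulation path and year
    growth = rng.normal(loc=0.04, scale=0.06, size=(n_sims, years))
    margin = rng.normal(loc=0.11, scale=0.02, size=(n_sims, years))

    sales = base_sales * np.cumprod(1.0 + growth, axis=1)   # simulated sales paths
    profit = (sales * margin).sum(axis=1)                   # cumulative 10-year profit per path

    p5, p50, p95 = np.percentile(profit, [5, 50, 95])
    print(f"10-year cumulative profit ($M): P5={p5:.0f}, median={p50:.0f}, P95={p95:.0f}")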

Confidential

Analytics Architect / Technical Lead

Responsibilities:

  • Designed and developed the ERP GPD Data Archive Reporting solution. Developed a Sqoop job using incremental loading to transfer data from the source SAP HANA system into HDFS. Created a Hive/Impala table, a Business Objects universe and a Web Intelligence report. Created a Best Practices document for development when using Hadoop for EDW purposes. Worked with Hadoop administrators to define the security configuration and create an Oozie job to automate the data loading.
  • Architected and developed a solution using Cisco Data Virtualization that combines data from SAP HANA, Microsoft SQL Server and Hadoop using Impala. Built 5 layers of views following the Cisco Data Virtualization Best Practices. Currently working on enhancing the Cisco DV Impala adapter to allow additional database function pushdown.
  • Completed a Shop Floor Optimization Proof of Concept using SAP Predictive Analytics. A linear regression algorithm was used in SAP PA to generate predictions for operation completion (see the regression sketch following this list).
  • Performed an in-depth analysis of Microsoft SQL Server In-Memory capabilities. Created identical scenarios across SAP HANA, Microsoft SQL Server 2014 and 2016 and benchmarked query performance.
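
Regression sketch referenced above. The proof of concept itself was built in SAP Predictive Analytics; this is only a generic scikit-learn stand-in with made-up features (planned quantity, standard hours per unit, rush flag) to show the shape of the model.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(1)
    n = 500
    X = np.column_stack([
        rng.integers(1, 100, n),        # planned_quantity
        rng.uniform(0.5, 8.0, n),       # standard_hours_per_unit
        rng.integers(0, 2, n),          # rush_order flag
    ])
    # Synthetic "actual completion hours" with noise
    y = 2.0 + 0.8 * X[:, 0] + 6.0 * X[:, 1] + 5.0 * X[:, 2] + rng.normal(0, 4, n)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LinearRegression().fit(X_train, y_train)
    print("Holdout MAE (hours):", round(mean_absolute_error(y_test, model.predict(X_test)), 1))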

Confidential

SAP HANA Architect & Technical Lead

Responsibilities:

  • Completed a Proof of Concept using SAP Agile Data Preparation.
  • Worked directly with SAP product management to provide feedback on potential product improvements. Completed a procurement use case where SAP Agile Data Preparation was able to duplicate the functionality of Hyperion Brio.
  • Transitioned the SSC EDW team to the new HANA modeling methodology using only Calculation views. Analytical and Attribute views are no longer being developed, and replacements using Star Join and Dimension Calculation Views are being built.
  • Architected and developed the VE-CAMS SAP HANA Real-Time reporting solution. Configured SLT connection to the Oracle source database for VE-CAMS. Replicated 40+ tables into SAP HANA. Designed and built HANA models to meet the reporting requirements.
  • Supporting the configuration of a TDI SAP HANA system in the classified environment. Mentoring classified support people and helping debug issues as we migrate objects from our unclassified environment into the classified area.
  • Designed and developed a Travel Analytics solution using SAP HANA, Business Objects Data Services and Analysis for Office. Created HANA models using the Star Join/Dimension Calculation View methodology.
  • Enabled SAP HANA auditing to track HANA model usage. Configured auditing rules to collect useful data without accumulating too much data volume. Created custom roles to give IT personnel access to the auditing data.
  • Designed and developed Material Earned Value Management Analytics solution. Worked directly with users to define requirements and support UAT. Designed and developed a complete end to end solution using SAP HANA, Business Objects Data Services and Web Intelligence.
  • Led numerous upgrades of HANA 1.0 revisions from 39 - 112.06. Organized testing efforts and analyzed detected issues. Researched and developed workarounds for revision changes and worked OSS messages with SAP when necessary.
  • Extensive HANA modeling experience with all view types: Attribute, Analytical, and Calculation (Graphical and Scripted). Designed generic-use models for standard SAP application areas: FI, CO, PP, IM, CATS, MM, PM, and QM.
  • Installed and configured HANA Live 1.0 and Operational Reporting Rapid Deployment Solution.
  • Architected solutions for real-time Manufacturing and Procurement reporting using SLT replicated tables and HANA models. Designed a trigger-based solution to format SAP status codes into a reportable format.
  • Designed a HANA “where used” report to track table usage in HANA models, since the HANA Studio “where used” feature only works for modeling objects.
  • Continually held design and development reviews to ensure all development efforts conform to Enterprise Data Warehousing standards. Eliminated “silo” based modeling by designing generic-use models for multiple applications. Eliminated redundancy in modeling between multiple overlapping projects.
  • Created and instructed a HANA modeling training class for the development team.
  • Created a scripting process to create HANA accounts, enable SSO via Kerberos and SAML, and assign HANA roles based on Windows Active Directory group membership, eliminating the need for manual maintenance of database accounts in HANA (see the provisioning sketch following this list).
  • Architected a complete HANA security model. Created roles for individual applications, developers, production support, and administrators. Created Analytical Privileges to restrict data based on static dimension values.
  • Developed a prototype for dynamic data security using stored procedures in the Analytical Privileges.
  • Architected a solution using Data Services and SLT to have the full SAP ECC long texts available in HANA in a reportable format.
  • Installed HANA eFashion universe for prototypes and easy reproduction of issues for SAP OSS messages.
  • Configured SAML authentication from BOBJ to SAP HANA.
  • Highly experienced with Data Provisioning using SAP System Landscape Transformation (DMIS). Performed replication of over 100 ECC tables. Significant performance tuning experience using alternate load methods (Read Types 1, 4 and 5) to reduce the initial load time of replication requests.
  • Configured SLT connection to an Oracle source system. Experienced with the intricacies of using SLT with an Oracle source system.
  • Created requirements and functional specifications for 3 use cases to demonstrate the functionality of SAP HANA.
  • Established Confidential with SAP HANA using the BW sidecar approach. Used Confidential to leverage SAP business content extractors from multiple SAP ECC systems.
  • Performed all necessary SAP BW configuration for Confidential.
  • Created process chains, datasources and infopackages to load HANA using Confidential.
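
Provisioning sketch referenced above (the HANA account scripting bullet). A minimal Python illustration using the hdbcli driver; the host, credentials, AD-group-to-role mapping and the get_ad_group_members() helper are hypothetical placeholders, and the production script also registered SAML identities and handled already-existing accounts.

    from hdbcli import dbapi   # SAP HANA Python client

    # Hypothetical mapping of AD groups to HANA roles
    GROUP_TO_ROLE = {
        "EDW_DEVELOPERS": "EDW_DEV_ROLE",
        "EDW_ANALYSTS": "EDW_REPORTING_ROLE",
    }

    def get_ad_group_members(group):
        """Placeholder: in practice this would query Active Directory (e.g. via LDAP)."""
        return {"EDW_DEVELOPERS": ["JDOE"], "EDW_ANALYSTS": ["ASMITH"]}[group]

    conn = dbapi.connect(address="hana-host", port=30015, user="SECURITY_ADMIN", password="***")
    cur = conn.cursor()

    for group, role in GROUP_TO_ROLE.items():
        for account in get_ad_group_members(group):
            # Create the user with a Kerberos external identity so SSO works, then grant the role
            cur.execute(f"CREATE USER {account} WITH IDENTITY '{account.lower()}@CORP.EXAMPLE.COM' FOR KERBEROS")
            cur.execute(f"GRANT {role} TO {account}")

    cur.close()
    conn.close()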

Confidential

Business Intelligence / EDW - Architect & Technical Lead

Responsibilities:

  • Technical lead of all ETL processing for the Enterprise Data Warehouse. Researched and defined best practices with SAP Data Services. Implemented mandatory peer reviews prior to migration of ETL batch jobs.
  • Extensive experience creating SAP Data Services ETL jobs. Developed jobs, dataflows, workflows, and many different types of transforms. Sources include Oracle, SQL Server, HANA, SAP BW, SAP ECC and Flat files. Significant experience working with ECC source systems including consuming extractors, hierarchies, ABAP data flows, Remote Function Calls, and using the RFC data load method.
  • Hands-on experience building universes on top of HANA. Experience with advanced scenarios using SSO connections (Kerberos and SAML) and passing Universe/WebI prompt selections into HANA input parameters.
  • Implemented the Business Objects Auditing universe and reports in version 4.1.
  • Led the upgrade of SAP Data Services 3.x to 4.0 and 4.0 to 4.2. Established direct connections to the SAP ECC source system with SAP Data Services 4.0 and 4.2 using SAP Business Content extractors, direct RFC connections and ABAP dataflows. Prototyped unstructured text processing using the new Text Analysis transformation in Data Services 4.0.
  • Supported the SAP Business Objects Business Intelligence upgrades from 3.2 to 4.0 and 4.0 to 4.1. Coordinated testing efforts, tracked issues, researched workarounds and managed OSS messages with SAP.
  • Analyzed requirements documents and created detailed system design for Travel Expense and Labor Reporting. Worked directly with end users to define requirements. Architected complete solutions for requirements using SAP Business Objects Data Services, MS SQL Server and integrated Excel connectivity for delivering data to the users.
  • Created Open Hubs in SAP BW using the third-party interface to extract data with Business Objects Data Services. Configured RFC destinations in SAP BW and connections to those RFC destinations in Data Services.
  • Architected a Microsoft SSAS usage statistics solution using the OLAP query logging feature.
  • Designed and developed Microsoft SSAS cubes to support Travel Expense and Labor Analysis analytical reporting. Designed fact and dimension tables in the Microsoft SQL Server database engine. Created complex MS SSAS cubes with multiple fact tables, many-to-many relationships, user-defined hierarchies, and roles to limit data access based on dimension values and AD groups.
  • Developed a collaborative SharePoint site for Microsoft Business Intelligence solutions. Reporting is delivered using Excel Services with SharePoint web part pages.
  • Lead architect of the Equipment Tracking Business Intelligence solution. Analyzed the requirements document and created the detailed system design and technical specification. Architected complete solutions for the requirements using Informatica, Oracle, Universe Designer and Web Intelligence. Developed SAP Business Objects universes using Oracle as the datasource. Developed SAP Business Objects Web Intelligence reports.

Confidential

SAP BI/BW - Enterprise Data Warehouse - Technical Team Lead

Responsibilities:

  • Lead architect of the NOMAD mainframe transition to SAP BW. Analyzed requirements documents and created the high-level system design for over 30 reports. Architected complete solutions for the requirements using SAP BW and Crystal Reports. Created technical specifications and mentored the FBM development team in the area of SAP BW. Developed the majority of the SAP BW objects, including DSOs, InfoCubes, transformations, DTPs, datasources, infopackages, BEx queries, process chains, user exit variables and Open Hubs.
  • Configured Business Objects Enterprise XI 3.1 integration with SAP NetWeaver. Imported SAP user IDs and roles into the BOE environment. Developed prototype solutions using Crystal Reports MDX and ODS connectivity. Identified the Business Objects integration kit transports required by the LM SSC EDW project. Completed functionality validation of BOBJ tools including Xcelsius, Crystal Reports, Web Intelligence, Query as a Web Service and Universe Designer. Installed and configured BOBJ client tools and the client-side integration kit.
  • Taught SAP BW 310 Enterprise Data Warehousing class to development team in preparation for upcoming transition to SAP BW.
  • Created standards documents and usage cases for Business Objects, SAP BW and Microsoft BI tools. Created a strategy to guide the use of these tools for different types of reporting requirements such as analytical, fixed-format, and dashboarding.
  • Created project schedules for each application area of development. Assigned resources to each area and delegated development tasks. Provided technical support to junior team members with development activities in IM, QM, CATS, CO, PP and PM. Conducted training sessions directly with end users. Managed the migration of transports to the UAT environment.
  • Analyzed Service Requests and provided accurate estimates. Worked with customers and development team to allocate resources to each request.
  • Held Procurement and Manufacturing Power User Training Sessions. Demonstrated the features of the SAP BW Query Designer. Educated the power user community on the available SAP BW PP and MM Infoproviders.
  • Developed a prototype for generating an SAP BW hierarchy from an Oracle datasource. This hierarchy allowed users to run reports for Employee IDs by management chain.
  • Developed a use case for Microsoft Business Intelligence tools in conjunction with SAP BW. Created an Open Hub prototype using the third-party interface to export SAP BW data into Microsoft SSIS. Configured the RFC destination, Open Hub Destinations and Process Chains required by Microsoft BI. Created Microsoft SSIS packages to load the data from SAP BW into MS SQL Server tables.
  • Designed the role structure for power and end user access by application area and analysis authorizations to protect sensitive data at the InfoObject level.
  • Performed GAP analysis between legacy data warehousing systems and the SAP BI Business Content. Documented necessary enhancements and customizations to meet the reporting requirements of the Space Systems ERP users. Identified when to leverage a delivered extractor or create a custom version.
  • Created a complete technical specification for all of the Production Orders development. This included designing the data flow, documenting InfoProvider data models and identifying the necessary datasources.
  • Gathered Requirements by holding meetings with manufacturing end users and documenting results in the functional specifications.
  • Created an SAP Layered Scalable Architecture (LSA) data model for Production Orders (PP) reporting. All incoming data is held in staging-layer DataStore objects.
  • Created a generic ABAP function module extractor to perform delta loads when two date fields (a creation date and a change date) are required for the delta.
  • Conducted and documented peer reviews in accordance with the Confidential Standard Engineering Process (SEP).
  • Developed all BI objects required for the Production Orders data model, including installing Business Content, enhancing Logistics Cockpit extractors, coding ABAP user exits, creating generic datasources, and creating DataStore Objects, InfoCubes, transformations, DTPs, infopackages, etc.
  • Led the development of the SSC BI Overhead Orders implementation. Assisted in the creation of an enterprise data model for an Overhead Controlling solution including purchase orders, purchase requisitions and internal order actuals. Delegated development tasks to the development team and assisted in their completion. Mentored junior team members.
  • Supported all defects detected during UAT and post Go-live. Promptly provided responses and corrections for the user community. Documented issues and resolutions in the SSC BI problem log.
  • Worked on the installation and stabilization of the Administration Cockpit. Created custom queries using Administration Cockpit Infoproviders. Analyzed and corrected errors in the Administration Cockpit delivered content.
  • Created the Overhead Orders technical specification detailing the creation of the entire reporting solution. Included design documents for DSOs, InfoCubes, Queries, Multiproviders and Datasources.
  • Created all required development objects including: extractors, user exits, DSOs, InfoCubes, Multiproviders, Transformations, DTP, Infopackages and Queries.
  • Wrote ABAP code used in transformation routines including end and start routines. End routines included SQL lookups of other DSOs and fiscal date conversion routines.
  • Installed all required business content and migrated datasources to 7.0 versions. Performed analysis of available business content for use in creation of this reporting solution.
  • Performed transport collection and transported all required objects to both the UAT and production environments. Debugged transport error codes and documented causes and resolutions.
  • Created and performed the Go-Live checklist.

Confidential

SAP BI/BW Project - Developer

Responsibilities:

  • Created the nightly batch schedule for the LM P2P BI project. Analyzed each individual process chain for data load dependency issues and made corrections where required. Identified gaps in process chains including missing data loads and lack of proper system maintenance (clearing change logs, deleting PSA tables, InfoCube compression, etc.).
  • Performed performance tuning on several large-volume reports. Created indexes on DSOs for faster data retrieval. Verified the use of these indexes by tracing the query execution plan. Created aggregates on InfoCubes. Remodeled several reports by rebuilding queries on InfoCubes, remodeling InfoCubes to keep dimension tables within 15% of the size of the fact tables, and creating line item dimensions when necessary. Eliminated the need for InfoSets by performing a one-to-many load to combine data from multiple levels of granularity.
  • Greatly improved data load performance by moving ABAP lookups to the start routine and storing the required data in internal hashed tables.
  • Assisted in the architecture, design and build of an advanced Enterprise Data Warehousing solution using the SAP recommended EDW model. Created staging DSOs which are used primarily to hold raw data from the source system. Designed second-level DSO InfoProviders to merge data from the various staging DSOs into a format which would satisfy the reporting requirements of the project.
  • Highly experienced in the use of Logistics Cockpit extractors, with a specialization in the purchasing extractors. Enhanced these structures using appends to the extractor structure and ABAP user exit coding to populate additional fields when necessary.
  • Created generic datasources from custom tables in SAP ECC. Worked with the ECC development team to ensure these custom tables could be delta enabled whenever possible.
  • Demonstrated expertise with the Standard Engineering Process by creating over 40 technical specifications documenting end-to-end solutions to meet the reporting requirements of the business. Worked very closely with the authors of functional specifications to identify gaps in the requirements, documented possible development issues using the standard SEP process, and worked with the functional users to resolve these issues and provide a satisfactory solution. Conducted peer, design, code and unit test reviews with the development and functional teams. Identified and documented potential performance issues with the functional specifications' reporting requirements.
  • Architected a strategy to combine procurement data from different source systems into one common reporting solution. Used MultiProviders, ABAP transformations and common InfoObjects to combine procurement data from SAP ECC and a legacy Oracle Data Warehouse into a single corporation-wide reporting solution.
  • Designed and developed complex Open Hub solutions for extracts to external systems. Merged data from multiple source systems into a single InfoProvider to create a single consolidated extract for corporate-wide procurement.
  • Created hundreds of custom InfoObjects with master data attributes and texts. Enhanced delivered InfoObject attributes. Activated and enhanced delivered master data extractors using both the 7.0 and 3.5 paradigms.
  • Modeled and developed InfoCubes based on reporting requirements. Identified InfoObject candidates for line item dimensions, analyzed the data to create properly structured dimension tables, and established logical partitioning.
  • Designed and developed PO and government contract number commitment InfoProviders. These DSO objects track changes in dollar values throughout the life of a PO at both the PO line item amendment level and the government contract level. Incorporated the use of the revision table EREV to identify POs that have been officially released to the vendor.
  • Performed transport collection for all types of back-end and front-end objects. Diagnosed the causes of failed transports and worked to resolve the issues. Performed a complete system collection for the creation of a new training system. Identified dependencies which cannot be collected via the transport connection (e.g. structures and functions used in transformation ABAP, ABAP code for user exit variables in DTPs, etc.).
  • Designed and created over 60 query objects including restricted and calculated key figures, replacement path variables for date arithmetic, step 1 user exit variables to populate default values, step 2 user exit variables to process user input, and custom web templates. Analyzed key figure calculation requirements and moved calculations to the back end when aggregation issues would arise from user navigation of the query characteristics.
  • Developed a series of custom extractors, DSOs and InfoCubes for the ECC ReadSoft bolt-on product.
  • Developed custom extractors, DSOs and InfoCubes to provide a reporting solution for a custom, non-standard SAP PO account assignment process in ECC.
  • Documented data mappings between ECC and BI in detailed DSO design documentation. Further documented data mappings between BI InfoProviders such as DSOs and InfoCubes.
  • Demonstrated expertise in all areas of transformations including ABAP coding of start routines and end routines. Also performed all types of individual transformation routines such as ABAP coding, master data lookups, formulas, etc. Performed a one-to-many load to a DSO by replicating the incoming records in the end routine based on a lookup of the key fields of the target DSO.
  • Developed a series of reusable ABAP function modules for legacy translation in transformation routines.
  • Performed 2 complete WebFOCUS environment upgrades on a corporate shared environment. One from WebFOCUS version 4.3 to 5.3 and the other from version 5.3 to 7.1.6. Performed root cause analysis on issues encountered as a result of these version upgrades and worked to resolve all issues. Worked directly with the WebFOCUS vendor IBI when required.
  • Developed a complete user auditing reporting solution using WebFOCUS Resource Analyzer, web services and WebFOCUS ReportCaster. This solution sent automated email notifications to anyone who had not executed a report within the corporate-mandated 180 days; access was removed if they did not run a report within 190 days, per corporate policy (see the sketch following this list).
  • Analyzed performance issues regarding the WebFOCUS resource analyzer volumes. Architected and built a solution that reduced the database storage requirements by over 90% without losing required reporting information. Presented this strategy at the IBI Summit (technical conference) in 2007.
  • Developed a complete WebFOCUS reporting solution called Precision Machining Capabilities. This reporting solution allowed users to search for suppliers based on their manufacturing capabilities and returned highly formatted PDF reports for printing.
  • Created a SQL*Loader process for loading flat files into Oracle databases to improve reporting performance and data integrity.
  • Performed routine WebFOCUS development and administration activities such as master file creation, reporting code fixes, creation of new reports, creation of input web pages using a variety of web languages, process monitoring, batch job scheduling and file system cleanup.
  • Created a portable JavaScript AJAX program that could easily be adapted for any website.
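
Sketch of the inactivity-check logic from the user auditing bullet above. The actual solution used WebFOCUS Resource Analyzer and ReportCaster; this is only a generic Python illustration with made-up usage data, and it prints rather than emails the notification.

    from datetime import datetime, timedelta

    WARN_AFTER = timedelta(days=180)      # corporate policy: notify at 180 days of inactivity
    REMOVE_AFTER = timedelta(days=190)    # access removed at 190 days

    # Last report execution per user, as it might come from a usage-audit table (made up)
    last_run = {
        "jdoe@example.com": datetime(2024, 1, 5),
        "asmith@example.com": datetime(2024, 11, 20),
    }

    today = datetime(2024, 12, 1)
    for user, last in last_run.items():
        idle = today - last
        if idle >= REMOVE_AFTER:
            print(f"REMOVE ACCESS: {user} ({idle.days} days since last report)")
        elif idle >= WARN_AFTER:
            print(f"NOTIFY: {user} has not run a report in {idle.days} days; "
                  f"access is removed after {REMOVE_AFTER.days} days per policy.")
    # In production the notification went out via ReportCaster email rather than print()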
