- Innovative, insightful, tenacious. I am an agile, fast-paced, technical, hands-on data integration specialist: an architect and project lead who can transform and standardize data to enable insightful communication and innovative reporting and analytics. Specializing in Informatica big data, data integration, master data management, and data quality, with a working knowledge of Python and Spark (PySpark) and other new technologies. Now certifying as an AWS Solutions Architect. Informatica specialist since 1997.
- An expert designer and hands-on development Team Lead of data integration solutions with 30+ years of experience. My specialty is designing and building the cleansed, standardized data sources for big data solutions, data warehouses, data marts, ODS-style data lakes, and today's analytics applications. Learning new technologies daily.
- 25+ years of hands-on design and development experience conceptualizing, designing, building, and delivering big data systems, data warehouses, advanced data applications, standardized business analytics, and business intelligence support systems while applying data quality best practices.
- I recently led an Informatica Big Data project for an international financial and insurance firm, utilizing onshore and offshore resources in the design, development, testing, and delivery of an automated claims scoring and processing application. I designed and led all coding, testing, and deployment efforts for this system, which handles the processing of claims for seventeen countries for this new business venture.
- Led master data and data quality initiatives across financial services and insurance, supply chain distribution, and Salesforce integration. As a highly skilled data integration architect, I am able to quickly assess, remediate, and transform multiple sources into concise analytics and reporting datasets using a best-fit agile approach. Experience with high-level master data project deployment, as well as detailed data stewardship and data standardization skills and methods. Certified as an Informatica Data Quality Professional, 2008.
- Fast: able to quickly assess, remediate, and transform multiple sources into concise analytics and reporting datasets using a best-fit agile approach.
- While I have a significant focus on Informatica technologies, I work in a vendor-neutral zone of design and methodology: big data HDFS, Hive, and ODS-style data lake designs, as well as straight scripting, SQL, and NoSQL. Newer training is in the areas of reporting (Talend), development (PySpark), and AWS.
- A proponent of iterative, agile project plans with proven PMI PMP skills.
- Skilled at listening to and working with business users to help them identify core business needs and translate those into workable technology solutions.
- Projects spanning HDFS, Hive, and most relational databases, including the Microsoft SQL Server stack, the Oracle stack, Teradata, and DB2.
- Relational projects include multiple petabyte-scale initiatives.
- Big data projects included a seventeen-country claims reporting and analytics project, which I led as a hands-on designer and developer with both onshore and offshore resources.
- AWS experience with RDS, Redshift, and DynamoDB via certification work, 2016-2017. Streaming, batch, and micro-batch data integration (ETL) experience.
- Most relational databases, including MS SQL Server, Oracle, Teradata, and DB2, as well as HDFS, Hive, HBase, and other big data sources and targets.
- Anything data integration: from data warehousing to analytics and reporting; custom applications; ERPs; data warehouses and data marts; and today's big data systems and data lake designs, using Inmon, Kimball, and Adamson approaches for dimensional, relational, hybrid, and legacy big data application solution designs.
- Skilled in the practical design of complex schedules and in the day-to-day enablement of the operations of multiple independent teams
- Experienced in accurate and timely assessment and prediction of future performance through the monitoring of data using tools such as Microsoft Project.
- Skilled at long range planning and working with clients to inform and help them react to changing technologies, budgets and requirements.
- Led an initiative to form a Supplier Master at a worldwide retail food provider.
- Project efforts began by establishing an MDM process and methodology, then performing data mastering of the supply chain for the first mastered system.
- Provided data transformation design and development to prepare analytics datastores using Talend DI for Big Data.
- Developed individual SQL scripts and Talend jobs used in complex 14-table transformations performed across Hive big data platforms.
- Provided data quality assessment and SAS script reengineering work to convert processes to MS SQL schemas and codebase.
- Assisted in the migration of the existing data store used for reporting into a more formalized data warehouse with industry standard schema design.
- Led multiple project tracks for a leading Healthcare and Medicare Provider
- Established a baseline of Data Quality Metrics to provide inter-project collaboration with other teams’ work in Finance, Pharmacy, Disease Management, Population Health Management and related systems. This work involved the identification of key data resources, and the profiling and assessment of each table to alert development teams to its suitability as a source for future work.
- Work also included the capture and standardization of data quality expressions.
- Created an expansive Asset Inventory for IT spanning 11 major data and system classifications and 62 subgroups of 60,000 items. This inventory is used to jump-start new projects by providing comprehensive inventories of data tables, existing reports and analysis so teams can build on existing resources more easily.
- Developed and maintained a Project Dashboard to track and report data metrics and task progress.
- Due to the nature of our onshore/offshore model, my roles included everything from user interviews, architecture, design, and development of all initial code, to enablement (i.e., training) of offshore development teams, as well as final quality assurance, testing, and acceptance processing.
- Led nightly training for offshore team members to build the knowledge and skills needed for our current projects.
- Developed methods and standards for expanding data integration onto the Confidential BigInsights big data platform using Informatica data integration, data quality, and data profiling projects. Included alternatives/workarounds for current versions of HDFS, Hive, and other tools in the environment.
- Also provided upgrade guidance and testing from v9.5 to v9.6.1 for our client.
- Joined an in-flight Confidential project to establish baselines for data quality around which the program was then built; worked with users to establish needed benchmarks of actual data quality versus the anticipated quality levels expected.
- Supplemented multi-platform DW and BI skills and manpower to existing teams as a 3rd data architect to handle application workloads on existing project work.
- Performed ETL and BI project oversight and guidance until it was determined that the additional workload anticipated due to a merging of 10 data centers into 4 was not imminent.
Senior Data Quality Analyst, Informatica Developer
- Performed Informatica PowerCenter, Informatica Data Quality and PL/SQL development for ongoing project during my transition and move from Washington DC to Denver CO.
Data Warehouse and BI Integration Lead
- Data integration lead working on multiple, very large OLTP and OLAP projects - designing, building and leading multiple data transformation, reporting and analysis, data integration, data quality, data profiling, Confidential, geocoding, identity resolution and consulting tasks to multiple end clients using best practices, products and methodologies of industry leading toolsets and databases.
- Many different and diverse assignments including FINCEN tasks.
Senior Data Quality Consultant
- Architected, developed, and trained the client on a state-of-the-art profiling and data quality solution, recognized with a TDWI Best Practices Award in 2011.
- Began by troubleshooting and solving complex coding issues involving real-time Informatica Data Quality in a large real-time computing environment.
- Invited to perform short term triage by Confidential, and then asked to stay on through several contract extensions to implement corrections and new data quality handling solutions, tightly integrated with customized package code from Teradata Corporation.
- Environment was PowerCenter real-time implementation of PowerExchange, PowerCenter, Data Quality and Profiling and Teradata to drive customer recognition and new marketing initiatives. Streaming data was cleansed and standardized as it was received from the host system.
Architect / Informatica Data Quality Specialist
- Worked on an end-of-year-funded 30-day project to design, develop, and implement an Informatica PowerCenter / Data Quality / Data Profiling 'core' application to better identify duplicate personnel, client, prospect, and provider information. Implemented the solution on time and on budget.
Principal Informatica PowerCenter and Data Quality Consulting
- Helped implement Confidential and quality initiatives for international SAP manufacturing client.
- Developed and delivered training on Informatica team based development, Informatica PowerCenter GRID, and Informatica Metadata Manager, as well as Informatica Data Explorer (data profiling) and Informatica Data Quality.
- Contracted again in 2009 by Confidential for Teradata Station Casinos Project.
Senior Data Architect
- Enabled the client to automate the intake of over 300 digital sources of data from all over the world.
- Roles included designing the project plan, designing the technical solution, and developing and implementing an XML-to-relational data model in Microsoft SQL Server. Automated ETL was developed by the team using Microsoft SSIS.
- Developed Informatica PowerCenter upgrade guidelines, then guided and supported a team of ten Informatica developers upgrading over 167 separate financial ETL systems from PowerCenter v7.1.2 to v8.1.1.
- Used Metadata Manager and SharePoint to automate project completion reporting.
- Exeros Discovery Studio Proof-Of-Concept for an international financial services firm in White Plains NY, 12/2007.
- Helped show methods to quickly identify data file and field content and quality.
Data Warehouse Architect
- Designed a financial credit scoring data warehouse application. Solution was implemented very quickly using Confidential PowerDesigner 12 and Oracle.
Data Warehouse Quality Assurance Consultant
- Ensured that a complete, quality data warehouse solution was delivered by the prime contractor for the Confidential on this initial Financial, Payroll and Human Resources data warehouse implementation for the Finance Division.
- Broadening the scope of work to include a role dedicated to quality assurance improved my skills as a project manager and data warehouse architect. Identified and alerted the client to several pending design issues and suggested potential changes, thereby saving all parties time and money.
Senior Data Quality and Data Profiling Lead
- Led US rollout of initial Informatica Data Quality and Informatica Data Explorer (data profiling) solutions for Professional Services while interfacing with Informatica and Similarity (Dublin) R&D teams.
- Initial clients included Major League Baseball Advanced Media, Affymetrix, Novation, and others; before rollout I also developed internal Change Data Capture (CDC) training for Professional Services staff.
Data Warehouse Project Manager, Technical Lead
- Developed Confidential system of a metadata data warehouse to allow better tracking of all education website and publication content, author and versioning for the Agency.
- Performed GAP analysis of current situation, developed a prototype, and then implemented final solution using Informatica PowerCenter and Confidential Cognos ReportNet.
- The contract was extended to assume leadership of another project for Confidential to implement the first statewide web-based reporting and analysis of K-12 test scores. The project brought the first computer-based analysis abilities to some of the school districts of the state.
Data Warehouse Manager
- Served as Team Manager for a data warehouse project during a time of transition and discovery; uncovered technical and architectural challenges and initiated targeted efforts to correct key issues.
Data Warehouse Project Manager, Designer, Technical Lead
- Replaced a failed 3-year data warehouse effort for HUD in which the metrics and design needed simplification and improved accuracy.
- Designed and led staff in the successful eight-month deployment of a Phase 1 data warehouse spanning six of eleven Real Estate business areas of the Department.
- Served as Project Manager, Technical Architect and one of the developers for this project that used Informatica PowerCenter, Oracle, Microsoft SQL Server, ERwin and MicroStrategy across both UNIX and Windows platforms.
- Data warehouse architect performing design, development and training to new ETL and Medicaid data warehouse staff members of Confidential Corporation.
- Tasks included new code development and analysis, systems research and verification, and documentation of the Informatica Metadata used in the Medicaid project, including source to target mappings of data fields.
- Trained 21 new Informatica ETL Developers from existing COBOL mainframe development staff.
Senior Solutions Architect
- Public Sector Pre-Sales Architect providing project and solution designs for clients and prospects.
- Specialized in visual geo-mapping and metric reporting of data.
- Developed prototype solutions in Homeland Security, Finance, Education, Fire & Safety, Court Administration and other areas of government.
Data Warehouse Project Architect
- Performed project troubleshooting for a new data warehouse project at Wolverine WorldWide that had become stalled due to technical data warehouse design and tool issues.
- Identified issues and designed plan in first month on-site, and then helped lead the team to a successful first implementation only 60 days later, after the implementation of Informatica PowerCenter (v3.5).
- Reverse engineered a custom AS/400 RPG financials application and a not-yet-completed BAAN ERP implementation together to form a successful Hyperion Essbase application, allowing Fanuc Robotics to perform pre-year-end trial close operations while their BAAN ERP was not yet fully implemented.
Data Warehouse Consultant
- Consultant for a Confidential Corporation data warehouse based reporting project.
- Designed the first web mapping sites used to track vehicle distribution from manufacturing plants in North America to dealer and storage lots across the USA.
- Performed database and data warehouse project management and project troubleshooting; also worked with PowerDesigner, the Confidential/SAP design suite.
- Developed Oracle to Confidential SQL Translation Coding Guide for a consulting firm.