ETL Architect Resume
Los Angeles, California, US
SUMMARY
- Confidential has 20+ years of experience transforming high volumes of data into standardized, actionable business information.
- Experienced in working with business users to turn business needs into creative technology solutions; he has exposure to business intelligence, data integration, data discovery, data profiling, and data quality.
- Confidential has hands-on experience in BI, big data, and cloud data integration. He currently works as a BI & AI Technical Architect serving healthcare, energy & utility, and Confidential customers.
- He is responsible for data integration, data quality, data security, data governance, and master data management, providing viable architectural solutions and managing efforts from conceptualization through offshore development to project delivery. As an ETL Technical Architect, he leverages a thorough understanding of business and system processes to recommend technical and non-technical solutions that meet business requirements, and has played a lead role in shaping and enhancing the overall ETL architecture and related technologies.
- Collaborates with leadership teams on increasing the scalability and robustness of the ETL platforms and solutions.
- He establishes and enforces standards and design patterns within data integration tools and technologies, and recommends opportunities for data reuse where appropriate.
- His design work covers security, administration, capacity planning, and performance tuning of the ETL platforms, ensuring that the ETL solutions and platforms meet service-level requirements.
- He has over two decades of successful data warehouse project delivery, implementing Bill Inmon and Kimball methodologies, and is an expert in dimensional, relational, hybrid, and related designs.
TECHNICAL SKILLS
Big Data / Analytics: Cloudera, Spark, Impala, Hive, R, Python, Azure ML, Cortana Intelligence Suite
Database(s): Oracle 11g, Sybase IQ, SQL Server 7.0/2008, MySQL 3.42, FoxPro, MS Access, SQLite, Teradata 15
4GL: Oracle SQL/PL/SQL, SQL Server 2010, SQL, T-SQL, Oracle GoldenGate
ETL Tools / Scheduler: Informatica 10.1, Ab Initio, DAC 7.8.4, Autosys (CA), Appworx, Redwood
Data Migration Tools: Oracle Migration Workbench 1.2, SwissSQL, Microsoft DTS, AWS DMS
Reporting Tools: Business Objects 10, OBIEE 10.2, Siebel Analytics 7.8, Crystal Reports, Data Report
Configuration Management: StarTeam 5.0, CM/Synergy 6.2, Visual SourceSafe 6.0, Perforce, SVN, Bitbucket
Project Management: SDLC, Agile, XP, VV&T, ITIL
Unix Shell: Korn shell scripting, AWK scripting
Data Security: Protegrity, Oracle Wallet
PROFESSIONAL EXPERIENCE
ETL Architect
Responsibilities:
- Collaborated with business and technical leads to understand key business and architectural requirements, defined the appropriate ETL/ELT data solution and architecture, and ensured it met QA and business needs.
- Involved in design and development activities such as source-to-target mapping, job flows, code reviews, deliverable reviews, and test planning.
- Conducted impact assessments and provided effort estimates based on business and technical requirements.
- Guided, coached, and mentored development teams and specialists on ETL best practices, design patterns, and decision rules.
- Analyzed and developed a system stability plan and implemented it in production, improving the performance of production ETL data loads.
- Provided project estimates, coordinated development efforts and discussions with business partners, updated status reports, and handled logistics to keep the project running smoothly and meet all deadlines.
ETL Architect
Los Angeles, California, US
Responsibilities:
- Collaborated with business and technical leads to understand key business and architectural requirements, defined the appropriate ETL/ELT data solution and architecture, and ensured it met QA and business needs.
- Involved in design and development activities such as source-to-target mapping, job flows, code reviews, deliverable reviews, and test planning.
- Conducted impact assessments and provided effort estimates based on business and technical requirements.
- Guided, coached, and mentored development teams and specialists on ETL best practices, design patterns, and decision rules.
- Analyzed and developed a system stability plan and implemented it in production, improving the performance of production ETL data loads.
- Provided project estimates, coordinated development efforts and discussions with business partners, updated status reports, and handled logistics to keep the project running smoothly and meet all deadlines.
Data Architect / Lead
Confidential, California, US
Responsibilities:
- Architected, designed, and developed all ETL/ELT components, including data acquisition, cleansing, standardization, validation, staging, database persistence, and job execution processes.
- Delivered the technical architecture for various outbound feeds sourcing information from the AWS/Heroku cloud.
- Developed an architecture document and designed data models and data flow processes to support reporting needs.
- Actively engaged with various external teams to obtain agreement on matters such as infrastructure setup, design decisions, and architectural approach.
- Led a team and was actively involved in solution design, data mapping, and supporting the ETL team during development of inbound and outbound interfaces.
- Designed and built the Common Data Platform in the AWS/Heroku cloud. The Common Data Platform hosts and syncs data from various PG&E and non-PG&E systems in batch and real-time mode wherever possible.
- The Customer Data Platform provides a canonical-model reusable data store from which data can be consumed by various Salesforce orgs and other PG&E on-premise systems.
- The Common Data Platform syncs data from the systems listed under Integration/Data Exchange - Applications and Systems.
- Provided the data migration strategy and implemented the production and non-production environments.
- Provided project estimates, coordinated development efforts and discussions with business partners, updated status reports, and handled logistics to keep the project running smoothly and meet all deadlines.
ETL Architect/lead
Confidential, California, US
Responsibilities:
- Designed and architected the end-to-end (E2E) system.
- Designed and developed data algorithms for data pattern analysis and for filling missing interval gaps.
- Provided project estimates, coordinated development efforts and discussions with business partners, updated status reports, and handled logistics to keep the project running smoothly and meet all deadlines.
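The resume does not describe the actual gap-filling algorithm, but one common approach for interval (e.g. meter-reading) data is linear interpolation across missing slots. A minimal sketch, assuming 15-minute intervals and sorted `(timestamp, value)` pairs (both assumptions, not stated in the resume):

```python
from datetime import datetime, timedelta

def fill_interval_gaps(readings, interval=timedelta(minutes=15)):
    """Fill missing interval readings by linear interpolation.

    `readings` is a list of (timestamp, value) tuples sorted by time;
    returns a new list with every expected interval slot present.
    Illustrative sketch only -- the production algorithm is not
    described in the source document.
    """
    if not readings:
        return []
    filled = [readings[0]]
    for (t0, v0), (t1, v1) in zip(readings, readings[1:]):
        gap = int((t1 - t0) / interval)   # number of interval steps in this span
        for i in range(1, gap):           # interpolate each missing slot
            t = t0 + i * interval
            v = v0 + (v1 - v0) * i / gap  # linear interpolation
            filled.append((t, round(v, 3)))
        filled.append((t1, v1))
    return filled
```

For example, a one-hour gap between two 15-minute readings yields three interpolated points between the endpoints.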
ETL Architect/lead
Confidential, California, US
Responsibilities:
- Designed and developed FastLoad and MultiLoad (MLoad) control-file scripts, and developed BTEQ scripts to process data on the staging server.
- Designed and developed physical models with appropriate primary, secondary, PPI, and join indexes, taking into consideration both the planned access of data and even distribution of data across all available AMPs.
- Designed and developed workflows and automated them with UNIX shell scripts and ESP. Created shell scripts to drop and re-create indexes to load data efficiently and decrease load time.
- Involved in the complete life cycle of several data warehouse implementations, including system architecture, functional/technical design, development (coding and testing: unit, system integration, and user acceptance), and implementation.
- Provided project estimates, coordinated development efforts and discussions with business partners, updated status reports, and handled logistics to keep the project running smoothly and meet all deadlines. Created SQL scripts for balancing financial data between Teradata systems using parallel testing procedures.
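The drop-index/load/rebuild pattern mentioned above avoids per-row index maintenance during a large batch load, which is usually faster than loading through the index. The original work used UNIX shell scripts against Teradata; the sketch below demonstrates the same pattern with Python's built-in sqlite3, with illustrative table and index names:

```python
import sqlite3

def bulk_load(conn, rows):
    """Drop the secondary index, bulk-load, then rebuild the index.

    Demonstrates the load pattern only; table name `stg_customer` and
    index name `idx_stg_cust` are hypothetical, not from the resume.
    """
    cur = conn.cursor()
    cur.execute("DROP INDEX IF EXISTS idx_stg_cust")                  # 1. drop index
    cur.executemany("INSERT INTO stg_customer VALUES (?, ?)", rows)   # 2. bulk load
    cur.execute("CREATE INDEX idx_stg_cust ON stg_customer(name)")    # 3. rebuild
    conn.commit()
    return cur.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0]
```

In the Teradata case the same three steps would be issued from a shell script via BTEQ rather than a DB-API connection.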
Technical Project Manager
Confidential, New York, US
Responsibilities:
- Designed the ETL architecture, incorporating data profiling, standardization, anomaly identification, and rule creation/application in mappings and mapplets; performed end-to-end data lineage using Metadata Manager.
- Performed a technical assessment of a 14-year-old legacy system and identified issues in process, data security, and architecture. Provided technical leadership and optimal solutions to resolve those issues.
- Designed and implemented a data validation framework that helps identify data quality issues across the OpsRisk downstream systems.
- Identified and tuned the top 45 performance-deficient sessions, bringing system refresh time down from 10 hours to 4 hours. Introduced utPLSQL for functional/unit testing and code coverage.
- Introduced a debug framework to identify performance issues at the Oracle stored procedure level.
- Upgraded Informatica from 8.1 to 9.1 and supported the Sun Solaris to Linux migration.
- Supported systems integration testing, user acceptance testing, and defect management.
- Responsible for project deliverables, including design and development, code review, version control, QA, UAT, and production deployment.
Technologies: Informatica 9.1, Business Objects 10, Oracle 11g, Java, Unix, Linux, Autosys
Project & Client: OMD Migration, Barclays Capital, New York, US
ETL Technical Architect
Confidential, New York, US
Responsibilities:
- Managed and created the scope, cost, schedule, and contractual deliverables through planning, tracking, quality assurance, change control, and risk management.
- Responsible for project deliverables, including the design/data mapping document, quality assurance, and the deployment plan.
- Designed and developed a standard framework for core asset-class migration from Sybase IQ to SQL Server.
- Involved in database capacity planning; designed daily file partitions with clustered and non-clustered indexes for database performance.
- Provided technical leadership and team management for the project and delivered with excellence.
- Guided the team on Informatica workflow, mapping, and SQL performance tuning.
- Provided the data load strategy for historical data loads into production; the components moved to production achieved 0% defect density.
- Configured MQ Series on the Linux server and guided data-read concepts in Informatica (UAT/LIVE).
- Self-trained on ETL data comparison testing using a third-party tool, identified the tool's scope, and designed and integrated an automated regression suite for testing.
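The resume does not name the comparison tool, but automated ETL data comparison typically reduces to diffing keyed row digests between a source and a target extract. A minimal sketch under that assumption (function and parameter names are illustrative):

```python
import hashlib

def row_digest(row):
    """Stable digest of one row (pipe-joined fields, UTF-8)."""
    return hashlib.md5("|".join(map(str, row)).encode("utf-8")).hexdigest()

def compare_datasets(source_rows, target_rows, key_idx=0):
    """Compare source vs. target extracts keyed on column `key_idx`.

    Returns (keys missing in target, keys extra in target, keys whose
    row contents differ). Illustrative regression-comparison sketch,
    not the actual third-party tool used in the project.
    """
    src = {row[key_idx]: row_digest(row) for row in source_rows}
    tgt = {row[key_idx]: row_digest(row) for row in target_rows}
    missing = sorted(set(src) - set(tgt))
    extra = sorted(set(tgt) - set(src))
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return missing, extra, mismatched
```

Running this after each load and asserting all three lists are empty gives a simple automated regression gate.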
ETL Technical Architect
Confidential, New York, US
Responsibilities:
- Architected, designed, and developed all ETL components, including data acquisition, cleansing, standardization, validation, staging, database persistence, and job execution processes.
- Designed and developed a standard framework for core system migration from VBA to Teradata.
- Designed and developed Informatica mapplets to fulfill project requirements. Drove the overall project with a metadata-driven approach, which will reduce maintenance costs in the future.
- Controlled defect density with a target of 0% across end-to-end development. Designed and developed ETL and SharePoint data comparison tools and automated this testing activity, saving considerable time, cost, and resources.
- Once the asset classes went live, users performed regression testing for 30 working days with zero defects, achieving 0% defect density.
Technical Lead
Confidential
Responsibilities:
- Responsible for team management, coordination with the onsite team, project planning, work allocation, and tracking of tasks and effort.
- Provided technical leadership and code review for the development team, including analysis of use cases and business requirements; involved in and implemented various major and minor releases.
- Responsible for project deliverables, including the design/data mapping document, deployment plan with source code, configuration management, UAT cases, and regular updates to the run book, KT, and system documents.
- Designed and coded for flexibility, maintainability, ease of support, and operational robustness. Drove reuse through components such as mapplets and worklets.
- Analyzed and designed the RPD data model (star schema, snowflake) per requirements and business rules.
- Coordinated with release and application support teams to ensure that solutions deployed smoothly into the QA and production environments.
- Defined support protocols and standard operating procedures. Worked closely with the reporting team to improve query performance for reports running in OLAP.
- Involved in Oracle 10g upgrades in the production environment. Identified the strategy for data acquisition and archival in production. Attended SCRUM/delivery calls with the development and VV&T teams.
Confidential
Responsibilities:
- Designed and coded for flexibility, maintainability, ease of support, and operational robustness. Drove reuse through components such as mapplets and worklets.
- Coordinated with release and application support teams to ensure that solutions deployed smoothly into the QA and production environments.
- Defined support protocols and standard operating procedures.
- Provided technical leadership and code review for the development team, including analysis of use cases and requirements, and design and implementation of major and minor releases.
- Designed, automated, and scheduled log backups, server stop/restart, and dashboard updates.
- Troubleshot performance issues with huge volumes of data.
- Worked with the reporting team to improve query performance for reports running in the OLAP DB.
- Involved in Oracle 10g upgrades in the production environment.
- Identified the strategy for data acquisition and archival.
- Attended SCRUM/delivery calls with the development and VV&T teams.