ETL Architect and Development Team Lead Resume
SUMMARY
- Over 17 years of experience across various operating systems and development environments, with extensive knowledge of the finance, insurance, HR, banking and call center domains.
- Over 11 years of experience as a solution architect, team lead, SME and business/systems analyst.
- Over 10 years of professional training experience, with excellent written and verbal communication skills.
- OLTP and OLAP applications on UNIX, Windows and mainframe systems.
- Design, architecture and maintenance of large data warehouses.
- Big data architecture, big data/Hadoop implementations, Hadoop administration, NoSQL/HBase/Cassandra.
- Designing the Hadoop HDFS file system, with strong knowledge of HBase, Hive and Pig configuration.
- Working with Hadoop ecosystem tools: Hive, Pig, Sqoop, MapReduce and Flume.
- Oracle, DB2, SQL Server, Hive analytical functions, Netezza and UNIX shell.
- SQL, PL/SQL, SQL*Loader, SQL*Net, NZSQL and NZLoad.
- Database and ETL performance tuning, using flat files, XML data files and mainframe datasets in ETL.
- ETL tools Ab Initio, Informatica PowerCenter, Cognos Data Manager and SSIS, and implementing ETL test processes.
- Data analysis and BI tools Cognos, BO, MicroStrategy and SSRS.
- Strong work experience in web-based application development, database programming, distributed computing, server-side programming and client-server computing in multi-threaded software systems using Java technologies.
- Rational tools ClearCase and ClearQuest, and design/modeling/mapping tools such as Erwin, PowerDesigner and UML.
- Scheduling tools Control-M, CA7, Autosys and Cybermation ESP, and version management tools ClearCase, PVCS and EME.
- SDLC (Software Development Life Cycle) through Agile and Waterfall methodologies.
- PMLC and project implementation using SDM (System Design Methodology) and DSS (Decision Support Systems).
TECHNICAL SKILLS
Big data: HDFS architecture, Hadoop ecosystem, Hive, Pig, HBase, MongoDB, Sqoop, MapReduce and Flume
Database: Oracle 7.x, 8.x, 9.x, 10g, DB2 UDB v8.1/9.5, SQL Server, Netezza
Languages: UNIX shell, SQL, DB2 SQL PL, Oracle PL/SQL, Hive, Pig, Java, Perl, SQL*Loader, NZLoad
Operating Systems: UNIX, Windows NT, Windows 2000 Server
Data warehouse: DB2, Oracle, Teradata, Netezza
ETL and OLAP: Ab Initio, Informatica PowerCenter, BO, SSIS, SSRS, MicroStrategy, Cognos BI, Cognos Data Manager
ERP: Oracle Applications 10.x and 11i Financial Modules
Scheduling Tools: Ab Initio EME, Tidal, Informatica Workflow Manager, SSIS, Cybermation ESP, CA7, Autosys, Control-M
Design modeling tools: Erwin, UML, Visio, PowerDesigner, MS Word, MS Excel
GUI and other tools: Oracle Forms and Reports, Rational UML, ClearCase, ClearQuest, TOAD, SQL Navigator
PROFESSIONAL EXPERIENCE
Confidential
ETL Architect and development team lead
Environment: Oracle, TOAD, Linux, shell script, Java, Mainframe, Ab Initio ETL, Hadoop (Hive, Pig, Sqoop, MapReduce, Flume), SSIS, XML, MVS, SAP BW, MS Visio, ESP, PowerDesigner, UML
Responsibilities:
- Architecture, design and implementation of the Hadoop ecosystem (HDFS, HBase, Hive, Pig) on Linux.
- Trained analysts to use Hadoop Hive and Pig to analyze emerging trends and recommend investment areas to customers.
- Trained a team of developers to use MapReduce to create meaningful, structured data from unstructured data and to load the structured data into Oracle databases using Sqoop.
- Performed impact analysis of the existing programs ahead of migrating all customers to the new Fusion system.
- ETL Configuration and Data Mapping from Source to Target using Power Designer.
- End-to-end ETL designs for data loading into Business specific Data marts and Multi-Dimensional Cubes for Analysis.
- Interpreted and applied business rules to design transformations and parameterize graphs for reusability.
- Designed custom reusable Ab Initio subgraphs using various components.
- Created end-to-end ETL designs in Ab Initio for extraction, validation and loading into business-specific data marts and multi-dimensional cubes for BI reporting and analysis.
- Created UML use cases covering validation and rollback steps for the data warehouse and ETL to achieve an end-to-end architecture solution.
- Scheduled jobs and created job dependencies using Cybermation ESP (mainframe based).
- Trained the offshore team and supported the offshore/onsite model.
- Involved in hands-on Hadoop development for big data using Hive and Pig.
- Interpreted and applied business rules to design transformations and parameterize mappings for reusability.
- Designed custom reusable subgraphs using various components to reduce the development effort for new modules.
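The Sqoop hand-off described above (structured MapReduce output pushed into Oracle) can be sketched as follows. The JDBC URL, HDFS path, credentials and table name are illustrative placeholders, not the actual project configuration; the script only assembles and logs the command so it can be reviewed before running on a real cluster.

```shell
#!/bin/sh
# Illustrative sketch: export MapReduce output from HDFS into an Oracle
# table via Sqoop. All connection details, paths and names are placeholders.
JDBC_URL="jdbc:oracle:thin:@dbhost:1521:ORCL"    # hypothetical Oracle SID
HDFS_DIR="/warehouse/structured/customer_trends" # hypothetical export dir
TARGET_TABLE="CUSTOMER_TRENDS"                   # hypothetical target table

# Build the command first so it can be logged (and dry-run) before execution.
SQOOP_CMD="sqoop export \
  --connect $JDBC_URL \
  --username etl_user --password-file /user/etl/.pw \
  --table $TARGET_TABLE \
  --export-dir $HDFS_DIR \
  --input-fields-terminated-by '\t' \
  -m 4"

echo "$SQOOP_CMD"      # log the full command for the run book
# eval "$SQOOP_CMD"    # uncomment on a cluster with the Sqoop client installed
```

Assembling and logging the command before execution mirrors the common loader practice of keeping an auditable run-book entry for every batch invocation.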
Confidential
ETL Architect and development team lead
Environment: Oracle, TOAD, Linux, shell script, Java, Mainframe, Ab Initio ETL, SSIS, XML, MVS, SAP BW, MS Visio, ESP, PowerDesigner, UML
Responsibilities:
- Interacted with the business team and stakeholders to understand requirements, created the SRS (Software Requirement Specification) and performed business analysis to translate the SRS into an SDS (Software Development Specification).
- Performed impact and data gap analysis of the existing programs and gathered new changes in the DWH.
- Created source-to-target data mappings using SQL tools and PowerDesigner.
- Created end-to-end ETL process design for data extraction, validation and loading into Detica warehouse, Business specific Data marts and Multi-Dimensional Cubes for BI Reporting and Analysis.
- Created detailed implementation plans with UML use cases for validation and rollback steps.
- Provided data warehouse and ETL level architecture solutions to achieve an end-to-end solution architecture.
- Interpreted and applied business rules to design transformations and parameterize graphs for reusability.
- Designed custom reusable Ab Initio subgraphs using various components.
- Created end-to-end ETL designs in Ab Initio for extraction, validation and loading into business-specific data marts and multi-dimensional cubes for BI reporting and analysis.
- Used the Workflow Monitor to track tasks, workflows and performance.
- Scheduled job flows and implemented error handling using SSIS packages through BIDS (Business Intelligence Development Studio).
- Trained the offshore team and supported the offshore/onsite model.
- Monitored offshore and onsite teams, providing developers with knowledge transfer on design and architecture as needed.
- Interpreted and applied business rules to design transformations and parameterize mappings for reusability.
- Designed custom reusable mapplets using transformations such as Joiner, Aggregator, Router, Expression and Filter.
- Followed the full SDLC to review code and unit test results and to support and validate rollout from DEV to QA to PROD environments.
- Followed the basic principles of PMLC (Project Management Life Cycle), partnering with other technical leads to ensure all deliverables were aligned and scheduled to deliver on target.
Confidential, PA
ETL Architect and development team lead
Environment: Oracle, TOAD, Linux, shell script, Java, Mainframe, Ab Initio ETL, Hadoop (Hive, Pig, Sqoop, MapReduce, Flume), SSIS, XML, MVS, SAP BW, MS Visio, ESP, PowerDesigner, UML
Responsibilities:
- Provided consulting to customers to identify big data use cases and guide them toward implementation.
- Big data strategic planning, technology roadmaps and talent acquisition.
- Mentored the team on cutting-edge technologies such as Hadoop, Hive, Pig, HBase and big data analytics.
- Trained the big data team on different big data solutions for multiple customers and verticals.
- Infrastructure setup, capacity planning and administration for customers' multi-node Apache Hadoop clusters.
- Responsible for the design, implementation and coding of J2EE-based applications using XML, Java, JDBC, Oracle and JSP.
- Performed impact and data gap analysis of the existing programs and gathered new changes in the DWH.
- Created end-to-end ETL process designs and detailed implementation plans with validation and rollback steps.
- Provided training on data warehouse and ETL solutions for loading business-specific multi-dimensional cubes for BI.
- Trained and monitored the offshore team in supporting the offshore/onsite model.
- Interpreted and applied business rules to design transformations and parameterize ETL code for reusability.
- Partnered with other technical leads to ensure all deliverables were aligned and scheduled to deliver on target.
- Trained the offshore team on scheduling tools to schedule individual and batch jobs.
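A Hive analysis of the kind used in the training above might look like the following sketch. The table and column names are hypothetical stand-ins; the script only generates the `.hql` file so the query can be reviewed before submission with `hive -f` on a cluster node.

```shell
#!/bin/sh
# Illustrative sketch: generate a HiveQL trend-analysis script over a
# hypothetical transactions table, then submit it with `hive -f`.
HQL=trend_analysis.hql

cat > "$HQL" <<'EOF'
-- Monthly totals per product, most recent first
-- (table and columns are illustrative placeholders)
SELECT product_code,
       substr(txn_date, 1, 7) AS txn_month,
       SUM(amount)            AS total_amount,
       COUNT(*)               AS txn_count
FROM   transactions
GROUP BY product_code, substr(txn_date, 1, 7)
ORDER BY txn_month DESC, total_amount DESC;
EOF

echo "wrote $HQL"
# hive -f "$HQL"   # uncomment on a node with the Hive client installed
```

Generating the HiveQL into a file rather than inlining it with `hive -e` keeps the query under version control alongside the shell wrapper.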
Confidential, NY
ETL Architect and development team lead
Environment: Microsoft Office, MS Visio, MS Project, UML, Oracle, DB2, TOAD, Linux, shell script, Java, Mainframe, Informatica PowerCenter 8.6/9.0, XML, MVS, Cognos BI, Tidal
Responsibilities:
- Interacted with the Informatica team to understand the requirements and performed business analysis to document them.
- Performed impact analysis of the existing programs and recommended new changes to the DWH through the SDS.
- Created use cases in UML, analyzed business requirements and converted them into system requirements.
- Worked through two projects implementing complete SDLCs following PMLC best practices.
- Created end-to-end ETL implementation plans with validation and rollback steps for data extraction, validation and loading.
- Used PowerCenter Designer tools to run pre-existing and debug sessions, monitoring and testing sessions prior to their normal runs in the Workflow Manager, and designed pre-session and post-session scripts in mappings.
- Designed and developed Informatica mappings, reusable transformations and mapplets for data loads.
- Interpreted and applied business rules to design transformations and parameterize mappings for reusability.
- Set up batches and sessions to schedule loads at the required frequency using the PowerCenter Workflow Manager.
- Created end-to-end ETL process designs for loading into business-specific data marts and multi-dimensional cubes.
- Development using Core Java, Servlets, PL/SQL Developer, JSP, HTML, XML and Eclipse.
- Delivered technical design documents with conversion and interface process flow diagrams and objects.
- Supported interface objects through unit testing, system test and rollout.
- Monitored the onsite team, providing adequate knowledge transfer along with design and architecture.
- Reviewed code and unit test results, and supported and validated rollout from DEV to QA and to PROD environments.
- Partnered with other technical leads to ensure all deliverables were aligned, scheduled, on target and delivered.
- Worked on the Tidal scheduling tool along with the mainframe.
- Used Quality Center to track defects in unit testing, INTG and UACC across projects, and created dashboards to verify that code was promoted correctly, reducing slippage and issues.
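A pre-session script of the kind referenced in the mappings above typically guards a session against missing or short source feeds. This is a minimal, self-contained sketch: the feed name, layout and row threshold are hypothetical, and the sample file is created inline only so the sketch runs on its own.

```shell
#!/bin/sh
# Illustrative pre-session check: verify the source file exists, is
# non-empty, and meets a minimum row count before the session starts.
# File name and threshold are hypothetical placeholders.
SRC_FILE="${1:-source_feed.dat}"
MIN_ROWS="${2:-1}"

# Create a tiny sample feed so the sketch is self-contained.
[ -f "$SRC_FILE" ] || printf '1001|ACME|250.00\n1002|GLOBEX|75.50\n' > "$SRC_FILE"

if [ ! -s "$SRC_FILE" ]; then
    echo "FAIL: $SRC_FILE missing or empty" >&2
    exit 1
fi

rows=$(wc -l < "$SRC_FILE")
if [ "$rows" -lt "$MIN_ROWS" ]; then
    echo "FAIL: $SRC_FILE has $rows rows, expected at least $MIN_ROWS" >&2
    exit 2
fi

echo "OK: $SRC_FILE has $rows rows"
```

A non-zero exit status is what lets the workflow manager fail the session fast instead of loading a partial feed.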
Confidential, PA
ETL Architect and development team lead
Environment: Ab Initio GDE and Co>Operating System, Oracle, Netezza, UNIX, shell script, Java, Mainframe, Informatica PowerCenter 8.6, XML, MVS, Cognos BI, Cybermation ESP, Lotus Notes, Microsoft Office, Visio, UML
Responsibilities:
- Created end-to-end ETL process design for data extraction, validation and loading.
- Created detailed implementation plans with validation and rollback steps.
- Provided data warehouse and ETL level architecture solutions to achieve an end-to-end solution architecture.
- Responsible for applying the capabilities of Informatica PowerCenter across the ETL process.
- Created and ran pre-existing and debug sessions to monitor and test normal runs in the Workflow Manager.
- Used the Workflow Monitor to track tasks, workflows and performance.
- Wrote pre-session and post-session scripts in mappings.
- Developed Informatica mappings, reusable transformations and mapplets to load the data warehouse and database (Oracle).
- Set up batches and sessions to schedule loads at the required frequency using the PowerCenter Workflow Manager.
- Delivered technical designs and created conversion and interface process flow diagrams.
- Provided solution architecture and design for a dynamic query builder process to generate queries in Oracle and Netezza.
- Monitored offshore and onsite teams, providing adequate knowledge transfer on the design and architecture used to load data from a variety of sources, files and databases.
- Interpreted and applied business rules to design transformations and parameterize graphs for reusability.
- Designed custom reusable Ab Initio subgraphs using various components.
- Created end-to-end ETL designs in Ab Initio for extraction, validation and loading into business-specific data marts and multi-dimensional cubes for BI reporting and analysis.
- Reviewed code and unit test results, and supported and validated rollout from DEVL to INTG to UACC to PROD environments.
- Partnered with other technical leads to ensure all deliverables were aligned, scheduled, on target and delivered.
- Worked on Cybermation ESP for scheduling along with mainframe JCL.
- Used Quality Center to track defects in unit testing, INTG and UACC across projects, and created dashboards to verify that code was promoted correctly, reducing slippage and issues.
- Provided training sessions to other groups to maximize reuse of existing code across ongoing projects.
Confidential
ETL Architect and development team lead
Environment: DB2, SQL Server, UNIX, shell script, Mainframe, WinSQL, Ab Initio ETL (GDE 1.15, Co>Operating System 2.15), XML, MVS, Cognos
Responsibilities:
- Supported projects in using Ab Initio for ETL and performed as-is analysis to fulfill requirements of cross-functional teams.
- Profiled source and target data, facilitated mapping work sessions to map in-scope attributes, and created mapping documents following Confidential's approved SDLC methodology processes and techniques.
- Responsibilities included designing and creating end-to-end ETL processes, data extraction and validation.
- Delivered technical design documents, created conversion and interface process flow diagrams, objects and relevant interface documentation, and supported interface objects through unit testing, system test and rollout.
- Worked with Ab Initio GDE to extract and transform data from a variety of sources such as MVS files, flat files, XML files and MS SQL Server databases, interpreting and applying business rules in transformations and parameterizing graphs.
- Created custom common Ab Initio graphs using components such as Join, FTP, Rollup, Partition/Departition, Gather, Normalize, Denormalize, Reformat, Merge, Join with DB, Dedup Sorted, Scan, Fuse and Validate.
- Reviewed code and unit test results, and supported and validated rollout from DEV to SYS to PAT to Production environments.
- Created detailed implementation plans with validation and rollback steps.
- Partnered with other technical leads to ensure all deliverables were aligned, scheduled, on target and delivered.
- Managed Quality Center for the Ab Initio projects to track defects in unit testing, SIT and UAT across projects, and created dashboards to verify that code was promoted correctly, reducing slippage and issues.
Confidential
BA, ETL architect, Development lead, SME, DWH / ETL Architect
Environment: Oracle 9.x, 10g, SQL Server, UNIX, shell script, MS SQL Server, .NET, PL/SQL, PL/SQL Developer, Ab Initio, SQL Navigator, XML, BO
Responsibilities:
- Worked in an Agile environment with time-boxed, iterative requirements gathering, development and testing, communicating and collaborating with cross-functional teams at all levels through adaptive planning.
- Reverse engineered the business process and identified the metadata entities to create the source-to-target mapping document.
- Designed, built and unit tested performance-oriented SQL queries and PL/SQL packages.
- Designed ETL transformations in Ab Initio and documented the implementation of maps and load routines for Finance Data Mart reporting and its integration with the Global Data Warehouse.
- Created design for transformations and complex graphs for data loading from relational and flat file sources into targets
- Designed and developed various mappings and mapplets to extract data from SQL Server and load to Oracle database
- Extracted the source definitions from various relational sources like Oracle, DB2 and CSV files.
- Developed transformation logic and designed various complex data transformations.
- Prepared code documentation in support of program development and wrote detailed design specifications.
- Investigated data quality issues, monitored and evaluated trends, and formulated corrective action.
- Oversaw code builds and deployments, adhering to change management standards.
- Participated in code reviews and code migrations, including pre- and post-certifications using automated UNIX scripts, and supported QA and PROD.
- Coordinated with support groups to minimize outages due to execution failures.
- Designed and created new PL/SQL packages containing business logic to support the MAPD business.
- Created new interfaces to downstream systems using PL/SQL packages and shell scripts.
- Analyzed and gathered target application design requirements and work structures, and changed data structures to fit the target application (Oracle).
- Involved in gathering requirements from Business analysts and Corporate Managers and mapping those requirements into design specifications.
- Performed performance tuning of mappings; created and executed functional and unit test cases.
- Tasks also include unit / integration / system testing, Documentation, QA support and handover Training.
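A shell-script interface hand-off of the kind mentioned above might look like this sketch. The feed layout, file names and trailer convention are hypothetical; the sample input is created inline so the sketch is self-contained.

```shell
#!/bin/sh
# Illustrative downstream-interface sketch: reformat an internal extract
# into the pipe-delimited layout a downstream system expects, appending a
# trailer record for reconciliation. Layouts and names are placeholders.
SRC=internal_extract.csv
OUT=downstream_feed.txt

# Sample internal extract (id,name,amount) so the sketch runs on its own.
printf '101,Alpha,10.00\n102,Beta,20.50\n' > "$SRC"

# Convert comma-separated input to pipe-delimited output with a TRL record.
awk -F',' 'BEGIN { OFS="|" } { print $1, $2, $3; n++ }
           END   { print "TRL", n }' "$SRC" > "$OUT"

cat "$OUT"
```

The trailer record (`TRL|<row count>`) gives the downstream system a cheap reconciliation check before it loads the feed.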
Confidential
DB2 UDB/UNIX Ab Initio Development lead
Environment: DB2 UDB on UNIX, UNIX Shell script, Ab Initio ETL, PVCS Version Manager.
Responsibilities:
- Developed against the DB2 database using SQL PL stored procedures, functions and packages.
- Analyzed and gathered target application design requirements and work structures, and changed data structures to fit the target application (DB2).
- Involved in gathering requirements from business analysts and corporate managers and mapping those requirements into design specifications.
- Implemented one complete SDLC cycle using the SDM and DSS methodologies.
- Designed and developed Ab Initio processes to extract data from flat files, transform it, and load and reload it into tables.
- Designed the ETL process in Ab Initio for loading the staging tables.
- Created DB2 Procedures for Loading/inserting data into the EDW database to achieve an optimal balance between performance and availability.
- Created DB2 Procedures for bulk loading of records from batch files and continuous insertion of single records from near real-time feeds and snapshots.
- Created SQL scripts with queries to initiate sample data for testing purposes.
- Extensively used DB2 built-in functions and UNIX shell functions for parallel processing wherever possible to improve performance.
- Developed and modified several UNIX shell programs to invoke the Ab Initio scripts that load data into staging tables and the stored procedures that load data into EDW production tables in parallel.
- Developed and modified several UNIX shell programs, DB2 SQL PL programs and stored procedures to perform the pre-inspection, post-inspection and extraction processes, retrieving data primarily from the database and optionally from flat files.
- Invoked stored procedures from the parallel loader shell script through a batch process.
- Tasks also include unit / integration / system testing, Documentation, QA support and transition Training.
- Created shell scripts for automating source data validation and data quality checking.
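The parallel-loader pattern described above (shell programs fanning out loads and then validating them) can be sketched as follows. The directory, file names and the load step itself are stand-ins: the real job would invoke the stored procedure or loader where this sketch counts lines.

```shell
#!/bin/sh
# Illustrative parallel-load sketch: hand each staging file to a background
# loader, wait for all of them, then validate their status files.
WORK=parallel_load_demo
mkdir -p "$WORK"

# Sample staging files (placeholders for the real extracts)
printf 'a\nb\nc\n' > "$WORK/feed1.dat"
printf 'x\ny\n'    > "$WORK/feed2.dat"

load_one() {
    f="$1"
    # The real job would invoke the loader / stored procedure here;
    # a line count stands in for the load and records a status file.
    wc -l < "$f" > "$f.status"
}

for f in "$WORK"/*.dat; do
    load_one "$f" &           # run each load in the background
done
wait                          # block until every background load finishes

# Post-load validation: every feed must have produced a status file.
for f in "$WORK"/*.dat; do
    [ -s "$f.status" ] || { echo "FAIL: $f"; exit 1; }
done
echo "all loads complete"
```

The `&`/`wait` pair is the core of the pattern: it lets independent staging loads overlap while keeping a single synchronization point before validation.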